It has become common practice for many clients, especially in the housing and health and social care sectors, to have a Facebook profile (or other social media profile) where the organisation can share news and allow customers and the public to interact and leave comments.
Social media provides an excellent opportunity to modernise the way you interact with customers, but how they interact with you can sometimes cause difficulties. For example, what should you do if a customer leaves an abusive comment in respect of a member of staff or another customer? Or if customers start abusing each other on your Facebook page? Surely, as an organisation you can’t be held liable for defamatory comments made by others on your Facebook page, or can you?
Two cases provide some guidance to organisations as to their potential liability.
Watts v Times Newspapers Ltd confirmed that liability for defamatory material extends to any person who participates in or authorises its publication. Of course, this will include persons such as authors and editors, but it may also apply to those who were involved in disseminating it, despite it not coming from them originally. Therefore, there is a potential risk that an organisation could become liable for the content posted by a customer or third party.
In a ‘traditional’ claim brought against the ‘author’ of a defamatory statement and/or a person who has published it, the question of responsibility for publication will usually be relatively easy to answer, i.e. who wrote or said it. However, understanding who is legally responsible for material published online can be difficult due to the ease of hiding identity. If you can identify them, is it viable to sue them? Alternatively, is it better to seek damages from an inadvertent publisher who has failed to act to remove the offending statement?
In Tamiz v Google Inc. the Court of Appeal found that it could be inferred that once the host (which would potentially include an organisation that has a social media group/page) is made aware of defamatory material, or if by the exercise of reasonable care they should know that the material is defamatory, they become responsible for the continued publication of that material if they refrain from removing it. However, before ‘notification’, the host is not considered to be a publisher.
In Tamiz, the inference of responsibility did not arise until Google Inc. had a ‘reasonable time within which to act to remove the defamatory comments’. In this instance the time period was deemed to be five weeks, but this will vary depending on the circumstances of each case and could be considerably less for a smaller organisation than Google Inc.
Although the Court’s reasoning in Tamiz provides general guidance, the position is still relatively unclear, particularly for organisations operating as group/page administrators. For instance, a company that actively moderates its Facebook group/page (for example, as an administrator) is more likely to be found to be a publisher sooner than one that does not, because it is more likely to know, or ought reasonably to know, that a publication is defamatory.
Therefore, to reduce the risk of liability, it is advisable to err on the side of caution by regarding yourself as a ‘publisher’ (even though this may not be the case).
We are increasingly being asked to advise clients in respect of comments made on their social media platforms. To protect yourself from potential defamation claims or adverse publicity, here are some points to remember:
- If practical, set out clear behavioural standards for group/page members to agree before joining. If this is not possible, consider placing some terms and conditions of behaviour within the “About Us” (or equivalent) section of the group/page.
- Consider implementing the membership approval function to control the number of individuals who have access to your group/page.
- If members breach the standards (e.g. by posting defamatory material, content involving data breaches or offensive posts), as the group admin, you can, and possibly should, delete the post as quickly as possible to limit the risk of potential liability.
- Depending on the nature of the post you should consider warning the individual that further acts of this kind will result in them being blocked or removed from the group/page.
- If the individual’s behaviour persists, and/or if the nature of the post warrants it, you should block or remove the individual from the group/page without notice or further warning.
- In more extreme cases you could report the post to Facebook, who will decide whether to remove the user entirely from the platform. If the post is quite clearly offensive or aggressive, you may also want to consider reporting it to the police.
- Depending on your resources and the nature of the posts within your group/page, you could use the Facebook tool allowing you to approve posts before publication. However, this does not mean that you can ignore the above points if you see defamatory or offensive material at the pre-approval stage. Bear in mind that, in this instance, you would be classed as a publisher far sooner because of your active management of the publication of posts.
There is, of course, a degree of proportionality to consider when administrating social media groups/pages, and a ‘one-size-fits-all’ approach doesn’t apply. However, adopting some, or all, of the best practice set out above will help to limit your risk of becoming embroiled in a potential defamation action through no fault of your own.