Legislation – Online Safety Act 2023
Changes to legislation:
There are currently no known outstanding effects for the Online Safety Act 2023, Cross Heading: Illegal content duties for search services.
PART 3Providers of regulated user-to-user services and regulated search services: duties of care
CHAPTER 3Providers of search services: duties of care
Illegal content duties for search services
26Illegal content risk assessment duties
(1) This section sets out the duties about risk assessments which apply in relation to all regulated search services.
(2) A duty to carry out a suitable and sufficient illegal content risk assessment at a time set out in, or as provided by, Schedule 3.
(3) A duty to take appropriate steps to keep an illegal content risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient illegal content risk assessment relating to the impacts of that proposed change.
(5) An “illegal content risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the level of risk of individuals who are users of the service encountering search content of the following kinds—
(i) each kind of priority illegal content (with each kind separately assessed), and
(ii) other illegal content,
taking into account (in particular) risks presented by algorithms used by the service, and the way that the service indexes, organises and presents search results;
(b) the level of risk of functionalities of the service facilitating individuals encountering search content that is illegal content, identifying and assessing those functionalities that present higher levels of risk;
(c) the nature, and severity, of the harm that might be suffered by individuals from the matters identified in accordance with paragraphs (a) and (b);
(d) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.
(6) In this section references to risk profiles are to the risk profiles for the time being published under section 98 which relate to the risk of harm to individuals presented by illegal content.
(7) See also—
(b) Schedule 3 (timing of providers’ assessments).
27Safety duties about illegal content
(1) This section sets out the duties about illegal content which apply in relation to regulated search services (as indicated by the headings).
All services
(2) A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service (see section 26(5)(c)).
(3) A duty to operate a service using proportionate systems and processes designed to minimise the risk of individuals encountering search content of the following kinds—
(a) priority illegal content;
(b) other illegal content that the provider knows about (having been alerted to it by another person or become aware of it in any other way).
(4) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way the search engine is designed, operated and used as well as search content of the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—
(a) regulatory compliance and risk management arrangements,
(b) design of functionalities, algorithms and other features relating to the search engine,
(c) functionalities allowing users to control the content they encounter in search results,
(d) content prioritisation,
(e) user support measures, and
(f) staff policies and practices.
(5) A duty to include provisions in a publicly available statement specifying how individuals are to be protected from search content that is illegal content.
(6) A duty to apply the provisions of the statement referred to in subsection (5) consistently.
(7)
(8)
Additional duty for Category 2A services
(9) A duty to summarise in a publicly available statement the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).
Interpretation
(10) In determining what is proportionate for the purposes of this section, the following factors, in particular, are relevant—
(a) all the findings of the most recent illegal content risk assessment (including as to levels of risk and as to nature, and severity, of potential harm to individuals), and
(b) the size and capacity of the provider of a service.
(11) In this section “illegal content risk assessment” has the meaning given by section 26.
(12) See also, in relation to duties set out in this section, section 33 (duties about freedom of expression and privacy).