Legislation – Online Safety Act 2023
Changes to legislation:
There are currently no known outstanding effects for the Online Safety Act 2023, Section 28.
PART 3 Providers of regulated user-to-user services and regulated search services: duties of care
CHAPTER 3 Providers of search services: duties of care
Search services likely to be accessed by children
28 Children’s risk assessment duties
(1) This section sets out the duties about risk assessments which apply in relation to regulated search services that are likely to be accessed by children (in addition to the duties about risk assessments set out in section 26).
(2) A duty to carry out a suitable and sufficient children’s risk assessment at a time set out in, or as provided by, Schedule 3.
(3) A duty to take appropriate steps to keep a children’s risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient children’s risk assessment relating to the impacts of that proposed change.
(5) A “children’s risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the level of risk of children who are users of the service encountering search content of the following kinds—
(i) each kind of primary priority content that is harmful to children (with each kind separately assessed),
(ii) each kind of priority content that is harmful to children (with each kind separately assessed), and
(iii) non-designated content that is harmful to children,
giving separate consideration to children in different age groups, and taking into account (in particular) risks presented by algorithms used by the service and the way that the service indexes, organises and presents search results;
(b) the level of risk of children who are users of the service encountering search content that is harmful to children which particularly affects individuals with a certain characteristic or members of a certain group;
(c) the extent to which the design of the service, in particular its functionalities, affects the level of risk of harm that might be suffered by children, identifying and assessing those functionalities that present higher levels of risk, including a functionality that makes suggestions relating to users’ search requests (predictive search functionality);
(d) the different ways in which the service is used, including functionalities or other features of the service that affect how much children use the service, and the impact of such use on the level of risk of harm that might be suffered by children;
(e)
(f) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.
(6) In this section references to risk profiles are to the risk profiles for the time being published under section 98 which relate to the risk of harm to children presented by content that is harmful to children.
(7) See also—
(b) Schedule 3 (timing of providers’ assessments).