
Data (Use and Access) Act 2025

Updated: Aug 7

This affects X, YouTube, TikTok and all social media platforms, as well as cloud storage services such as Google Drive, Dropbox, and Apple iCloud, which means we have lost free access to our own online data. Now we cannot speak freely because the government are, as I predicted, trawling through everything we write and everything we have ever written on the Internet. This Act has made criminals of us all for everything we have ever said or will ever say.


And check this video, which explains that this Act was apparently due entirely to input from the Carnegie Foundation. If you want to find out some interesting information about 'Carnegie', I would recommend a visit to the Carnegie Museum in Dunfermline, Scotland.


Philanthropic Philosophy:

Carnegie's belief that wealth should be used for the public good, rather than personal accumulation, is a central tenet of his legacy.

Founding of Institutions:

Carnegie established numerous institutions like libraries, universities, and peace foundations, which have had a lasting impact on society.

Inspiring Others:

His writings and actions have inspired countless individuals to pursue philanthropy and social responsibility.




What does AI have to say?


AI Overview


The "Online Safety Act" and the "Data (Use and Access) Act 2025" are two separate pieces of UK legislation. The Online Safety Act focuses on regulating illegal and harmful content online, particularly for children. The Data (Use and Access) Act 2025, also known as the "Data Act", focuses on data access and use, including digital identity verification and new Smart Data schemes. 

Online Safety Act:

  • Purpose: To protect children and adults from illegal and harmful content online.

  • Key Provisions:

    • Imposes duties on social media companies and search services to protect users from illegal content. 

    • Requires platforms to take down illegal content and implement systems to reduce the risk of illegal activity. 

    • Focuses on protecting children from harmful content, including age-inappropriate content and content that encourages self-harm or suicide. 

    • Ofcom is responsible for enforcing these duties. 

  • Timeline: Received Royal Assent on 26 October 2023, with some provisions already in effect. 

Data (Use and Access) Act 2025 (Data Act):

  • Purpose:

    To enable growth of digital verification services, new Smart Data schemes, and a new National Underground Asset Register, while also updating data protection and privacy legislation. 

  • Key Provisions:

    • Enables digital identity trust framework. 

    • Introduces new Smart Data schemes, modelled on Open Banking.

    • Changes data protection laws to promote innovation and economic growth. 

    • Includes provisions for fair access to and use of data. 

  • Timeline:

    Received Royal Assent on 19 June 2025, with changes being phased in between June 2025 and June 2026, according to the Information Commissioner's Office.

  • Key aspects:

    • Prohibits unfair contractual terms regarding data access and use, according to Bird & Bird.

    • Applies to both personal and non-personal data accessed and used based on contracts between businesses, according to Bird & Bird. 

Key Differences:

  • Scope:

    The Online Safety Act focuses on regulating content and user safety online, while the Data Act focuses on data access, use, and digital identity. 

  • Enforcement:

    Ofcom is responsible for enforcing the Online Safety Act, while the Information Commissioner's Office (ICO) plays a key role in implementing and enforcing the Data Act. 

In essence, the Online Safety Act aims to create a safer online environment, while the Data Act aims to unlock the potential of data for innovation and economic growth, while also ensuring data protection and privacy. 


Just remember the Carnegie UK Trust 😲😲😲😲


So, more from AI:


AI Overview



Carnegie UK played a significant role in shaping the UK's Online Safety Act, particularly in advocating for a systemic approach to online harms. They proposed a duty of care for online platforms, overseen by a regulator, to address reasonably foreseeable harms. This approach, which focuses on platform design and operation rather than individual cases, was later adopted by the UK government in its Online Harms White Paper, according to evidence submitted to UK Parliament.


Do you want to see what the submission said?


Written evidence from Carnegie UK Trust (TEC 36)

Public Administration and Constitutional Affairs Committee

The Work of the Electoral Commission inquiry


1. We welcome the Committee's inquiry into the work of the Electoral Commission and the opportunity to submit evidence. Our response is limited to a specific issue regarding the introduction of a statutory duty of care for online harm reduction, which we expect the Government to bring forward in the Online Harms Bill early in the New Year, and how the work of the Electoral Commission might fit into a wider regulatory approach that includes the reduction of harms to democracy and electoral processes. We would be happy to provide further information to the Committee if helpful.

About our work

2. The Carnegie UK Trust was set up in 1913 by Scottish-American philanthropist Andrew Carnegie to improve the wellbeing of the people of the United Kingdom and Ireland. Our founding deed gave the Trust a mandate to reinterpret our broad mission over the passage of time, to respond accordingly to the most pressing issues of the day, and we have worked on digital policy issues for a number of years.

3. In early 2018, Professor Lorna Woods (Professor of Internet Law at the University of Essex) and former civil servant William Perrin started work to develop a model to reduce online harms through a statutory duty of care, enforced by a regulator. The proposals were published in a series of blogs and publications for Carnegie and developed further in evidence to Parliamentary Committees [1]. The Carnegie April 2019 policy document [2], 'Online harm reduction – a statutory duty of care and regulator', discusses the arguments for a systemic approach at length, building on a "precautionary principle" that places responsibility for the management and mitigation of the risk of harm - harms which they have had a role in creating or exacerbating - on the tech companies themselves.

4. The Lords Communications Committee [3] and the Commons Science and Technology Committee [4] both endorsed the Carnegie model, as have a number of civil society organisations [5]. In April 2019, the government's Online Harms White Paper [6], produced under the then Secretary of State for Digital, Culture, Media and Sport, Jeremy Wright, …

[1] Our work, including blogs, papers and submissions to Parliamentary Committees and consultations, can be found here: https://www.carnegieuktrust.org.uk/project/harm-reduction-in-social-media/
[2] https://d1ssu070pg2v9i.cloudfront.net/pex/carnegie_uk_trust/2019/04/08091652/Online-harm-reduction-a-statutory-duty-of-care-and-regulator.pdf
[3] https://publications.parliament.uk/pa/ld201719/ldselect/ldcomuni/299/29902.htm
[4] https://publications.parliament.uk/pa/cm201719/cmselect/cmsctech/822/82202.htm
[5] For example, NSPCC: https://www.nspcc.org.uk/globalassets/documents/news/taming-the-wild-west-web-regulate-social-networks.pdf; Children's Commissioner: https://www.childrenscommissioner.gov.uk/2019/02/06/childrens-commissioner-publishes-astatutory-duty-of-care-for-online-service-providers/; Royal Society for Public Health: https://www.rsph.org.uk/our-work/policy/wellbeing/new-filters.html


And the PDF.

© 2020 CAROLINE STEPHENS
