A Church of England bishop has issued ten commandments to help protect humans from potential abuses of Artificial Intelligence.
The Bishop of Oxford, Steven Croft, who sits on the House of Lords Select Committee on AI, urged the Government and the technology industry to take the ethical implications of AI advancements seriously.
RT reports: While AI is often hailed as the future of economic development, there are many who are more wary of the technology’s advance. Right Rev Steven Croft set out his ‘commandments’ to address such concerns at a policy debate in London last week.
Bishop Croft had become worried after hearing evidence of the technology’s power while sitting on the Lords committee on AI, previously warning that “every development in Artificial Intelligence raises new questions about what it means to be human.”
Speaking at the event ‘Artificial Intelligence and Robotics: Innovation, Funding and Policy Priorities’, the Bishop created his own Decalogue for AI. Croft’s commandments focus on data protection, ethics, criminal subversion and restrictions on AI’s power “to destroy.”
The commandments contain general advice, such as “AI should be designed for all, and benefit humanity,” as well as more political dictums: “The application of AI should be to reduce inequality of wealth, health, and opportunity,” and AI should not “subvert the values of our democracy.”
Rather than being a driver of economic gain, the bishop believes AI should be “…directed toward the most urgent problems facing humanity.” The list contains an inevitable reference to a robot revolution, warning: “The autonomous power to hurt or destroy should never be vested in artificial intelligence.”