  {"version":"1.0","provider_name":"SAP Australia &amp; New Zealand News Center","provider_url":"https:\/\/news.sap.com\/australia","author_name":"Ian Ryan","author_url":"https:\/\/news.sap.com\/australia\/author\/ianryan\/","title":"Opinion: Building explainability into AI projects","type":"rich","width":600,"height":338,"html":"<blockquote class=\"wp-embedded-content\" data-secret=\"HdjPGaUK2O\"><a href=\"https:\/\/news.sap.com\/australia\/2021\/09\/14\/opinion-building-explainability-into-ai-projects\/\">Opinion: Building explainability into AI projects<\/a><\/blockquote><iframe sandbox=\"allow-scripts\" security=\"restricted\" src=\"https:\/\/news.sap.com\/australia\/2021\/09\/14\/opinion-building-explainability-into-ai-projects\/embed\/#?secret=HdjPGaUK2O\" width=\"600\" height=\"338\" title=\"&#8220;Opinion: Building explainability into AI projects&#8221; &#8212; SAP Australia &amp; New Zealand News Center\" data-secret=\"HdjPGaUK2O\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\" class=\"wp-embedded-content\"><\/iframe><script type=\"text\/javascript\">\n\/* <![CDATA[ *\/\n\/*! 
This file is auto-generated *\/\n!function(d,l){\"use strict\";l.querySelector&&d.addEventListener&&\"undefined\"!=typeof URL&&(d.wp=d.wp||{},d.wp.receiveEmbedMessage||(d.wp.receiveEmbedMessage=function(e){var t=e.data;if((t||t.secret||t.message||t.value)&&!\/[^a-zA-Z0-9]\/.test(t.secret)){for(var s,r,n,a=l.querySelectorAll('iframe[data-secret=\"'+t.secret+'\"]'),o=l.querySelectorAll('blockquote[data-secret=\"'+t.secret+'\"]'),c=new RegExp(\"^https?:$\",\"i\"),i=0;i<o.length;i++)o[i].style.display=\"none\";for(i=0;i<a.length;i++)s=a[i],e.source===s.contentWindow&&(s.removeAttribute(\"style\"),\"height\"===t.message?(1e3<(r=parseInt(t.value,10))?r=1e3:~~r<200&&(r=200),s.height=r):\"link\"===t.message&&(r=new URL(s.getAttribute(\"src\")),n=new URL(t.value),c.test(n.protocol))&&n.host===r.host&&l.activeElement===s&&(d.top.location.href=t.value))}},d.addEventListener(\"message\",d.wp.receiveEmbedMessage,!1),l.addEventListener(\"DOMContentLoaded\",function(){for(var e,t,s=l.querySelectorAll(\"iframe.wp-embedded-content\"),r=0;r<s.length;r++)(t=(e=s[r]).getAttribute(\"data-secret\"))||(t=Math.random().toString(36).substring(2,12),e.src+=\"#?secret=\"+t,e.setAttribute(\"data-secret\",t)),e.contentWindow.postMessage({message:\"ready\",secret:t},\"*\")},!1)))}(window,document);\n\/\/# sourceURL=https:\/\/news.sap.com\/australia\/wp-includes\/js\/wp-embed.min.js\n\/* ]]> *\/\n<\/script>\n","thumbnail_url":"https:\/\/news.sap.com\/australia\/files\/2020\/04\/282191_GettyImages-518142242-1_2600px_72dpi_qual8.jpg","thumbnail_width":2600,"thumbnail_height":1733,"description":"Accelerating medical research, increasing public safety, building smart cities and continually improving the services used by citizens every day are just a few examples of the benefits that artificial intelligence (AI) can deliver in the public sector, writes Ian Ryan. Yet compared with many private sector industries, it\u2019s fair to say that public sector adoption of AI technology has been more 
measured. Governments and other public sector organisations face a number of significant challenges, from the availability of skills and investment funding, to demonstrating value and ensuring transparency about how decisions are made. These challenges are reflected in the SAP Institute for Digital Government\u2019s latest report \u2013\u00a0Building Explainability into Public Sector Artificial Intelligence\u00a0\u2013 developed in partnership with the University of Queensland. While 80 per cent of public sector organisations are actively working towards data-driven transformation, fewer than 15 per cent have progressed beyond prototypes."}