

Why is it important for data scientists to seek transparency?


Published: 2023-07-16 17:42:21  Source: http://www.soulwars.cn/

Abstract: Transparency is especially important in data science projects and machine learning programs, partly because of the complexity and sophistication that drives them.

Transparency is especially important in data science projects and machine learning programs, largely because of the complexity and sophistication that drives them. These programs are "learning" (generating probabilistic results) rather than following predetermined, linear programming instructions, and as a result it can be hard to understand how the technology reaches its conclusions. The "black box" problem, in which a machine learning algorithm's decisions cannot be fully explained to human decision-makers, is a major one in this field.

With that in mind, mastering explainable machine learning, or "explainable AI," is likely to become a main focus as companies recruit data scientists. DARPA, the institution that brought us the internet, is already funding a multimillion-dollar study of explainable AI, promoting the skills and resources needed to create machine learning and artificial intelligence technologies that are transparent to the humans who rely on them.

One way to think about it is that talent development often has a "literacy stage" and a "hyperliteracy stage." For a data scientist, the traditional literacy stage is knowing how to put together machine learning programs and build algorithms in languages like Python, and how to construct and work with neural networks. The hyperliteracy stage is mastering explainable AI: providing transparency in the use of machine learning algorithms, and preserving that transparency as the programs work toward their goals and the goals of their handlers.
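To make the "hyperliteracy" idea concrete, here is a minimal sketch of one common model-agnostic explainability technique, permutation importance: shuffle one input feature and see how much the model's error grows. The article does not prescribe this method; the toy data, the `black_box` function, and all coefficients below are illustrative assumptions, not part of the original text.

```python
import random

random.seed(0)

# Toy dataset: the target depends strongly on x1 and only weakly on x2
# (coefficients 3.0 and 0.1 are chosen purely for illustration).
data = [(random.random(), random.random()) for _ in range(200)]
targets = [3.0 * x1 + 0.1 * x2 for x1, x2 in data]

def black_box(x1, x2):
    """Stand-in for an opaque model whose internals we pretend not to see."""
    return 3.0 * x1 + 0.1 * x2

def mse(preds, actual):
    """Mean squared error between predictions and true targets."""
    return sum((p - a) ** 2 for p, a in zip(preds, actual)) / len(actual)

baseline = mse([black_box(x1, x2) for x1, x2 in data], targets)

def permutation_importance(feature_index):
    """Shuffle one feature's values across the dataset and measure how much
    the model's error grows; a larger increase means the model leaned on
    that feature more heavily."""
    shuffled_col = [row[feature_index] for row in data]
    random.shuffle(shuffled_col)
    perturbed = []
    for row, value in zip(data, shuffled_col):
        row = list(row)
        row[feature_index] = value
        perturbed.append(row)
    return mse([black_box(x1, x2) for x1, x2 in perturbed], targets) - baseline

importance_x1 = permutation_importance(0)
importance_x2 = permutation_importance(1)
print(f"importance of x1: {importance_x1:.4f}")
print(f"importance of x2: {importance_x2:.4f}")
```

Shuffling x1 degrades the predictions far more than shuffling x2, which tells a human reviewer where the model's conclusions actually come from, without opening the model itself.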

Another way to explain the importance of transparency in data science is that the data sets being used keep growing more sophisticated, and therefore potentially more intrusive into people's lives. A further major driver of explainable machine learning and data science is the European General Data Protection Regulation (GDPR), implemented in 2018 to curb unethical use of personal data. Using the GDPR as a test case, experts can see how the need to explain data science projects intersects with privacy and security concerns, as well as business ethics.


This article was compiled and published by the Japan NEC Lithium Battery China Marketing Center on 2023-07-16 17:42:21. Please credit the source when reposting.