Presentation about the state of AI, policy-relevant AI research and evidence gaps that can be addressed with new data, methods and modelling approaches.
2. About today
Goal: identify some avenues of research that might yield evidence to inform AI policy. Hopefully we can pursue some of them together!
Structure:
1. Problem definition
2. State of the world
3. State of knowledge
4. New directions
3. 1. Problem definition
Artificial Intelligence: "The designing and building of intelligent agents that receive percepts from the environment and take actions that affect that environment."
The goal: AI systems are deployed in a way that enhances societal wellbeing.
[Diagram: four interacting forces around that goal — Technology development, Economic organisation, Values, Policy]
4. 1. Problem definition: A question of control
[Diagram: at each period t there is a desired state S_t^d and an actual state S_t^a, with a gap δ_t between them; policy P = f(δ, K_t)]
Policies P are informed by our understanding of the state of the world δ(S_t^d, S_t^a) and our state of knowledge K about what to do.
Question: is K_AI == K_IT?
5. 2. State of the world: Technology development
● We are moving into a large model era
● Model performance scales with size… but not for all tasks
● Important breakthroughs in AI for scientific R&D
(Sevilla et al 2022)
6. 2. State of the world: Economic organisation
● Industrialisation of AI: private sector generating the most significant advances
● Emergence of the foundation model pipeline / business model
● Geopolitical fragmentation of AI R&D
(Ganguli et al 2022)
7. 2. State of the world: Values
● LLMs create significant ethical risks
● This has a counterfactual aspect: values missing from paths not explored
● Proliferation of ethical principles and guidelines
(Weidinger et al 2022)
8. 2. State of the world: Policy
● Continued public investment in AI R&D
● Increased regulatory pressure on risky AI systems (EU AI Act)
○ Including bans, e.g. facial recognition
● Increased attention being given to implementation and experimentation
9. 2. State of the world: Summary
AI is evolving towards larger models developed by a small number of mostly private actors with access to big data and compute.
Model performance improves with size in many but not all dimensions, and the deployment of these models raises important ethical risks.
Policymakers are becoming more assertive about acceptable / unacceptable AI applications while continuing to support its development.
10. 2. State of the world: Underlying patterns
■ Open science / open source model
■ Metric-driven innovation speeds the pace but induces racing & gaming
■ Hyper-accelerated technical evolution
■ Architectural / systemic innovation
○ Combinatorial explosion of implementation possibilities: more directionality
■ Hidden and surprising failure modes
■ Highly scalable in a narrow but profitable set of use-cases
11. 3. State of knowledge: The AI "production function"
[Diagram: inputs (Funding, Talent, Data, Compute, Research, Code / tech, Models) flow through organisations, institutions, ecosystems and polities into outputs (Products, Services, Platforms, Companies) and impacts]
We can use this to tell stories: "Corporate research teams in the USA and China have leveraged big private and open datasets and compute to build large language models that are now powering internet services, creating big economic impacts and raising ethical risks."
But we don't really understand how this new, complex, fast-changing system works.
12. 3. State of knowledge: Some gaps
[Diagram: the same production function, annotated with open questions]
● How does this open (but uneven) infrastructure shape development and diffusion?
● What is the link between diversity in talent and diversity in ideas pursued?
● Does private sector leadership skew the technologies that are developed and their impacts? What about regional and national angles?
● What are the hidden impacts of the technologies that are developed, and how do they compare with counterfactual paths?
13. [Diagram: fields studying AI and their characteristic focus]
● Economics of AI (mainstream) — jobs, income share and productivity
● AI progress forecasting — inputs and outputs
● AI metascience — scientometrics
● AI safety — technical failure modes and risks
● AI governance — institutions
● Critical studies of AI — power and justice
● AI ethics — principles and guidelines
This is partly a consequence of knowledge fragmentation. Could a complex economics of AI help to bridge some of these gaps?
14. [Diagram: the fields mapped on two axes — Constructionist vs Deterministic, and Quantitative / multi-site vs Quantitative / single-site — with the proposed Complex economics of AI positioned between AI metascience, mainstream Economics of AI and Critical studies of AI]
Deterministic pole: one technological trajectory; AI will be adopted and the role of policy is to accelerate / prepare / adapt.
Constructionist pole: multiple technological trajectories; AI might not be adopted. The role of policy is to steer / balance and even block.
15. 3. State of knowledge: Some comments on the Econ of AI
■ Models AI progress as a scalar
○ Looking at patents, survey data, some online job ads
■ Strong focus on automation
○ So far ignoring one of the main "exposed" labour markets: scientific R&D
■ Some attention to complementarities in organisation, less so in production
■ Technologically deterministic
○ With some exceptions: Acemoglu and Korinek
16. 3. State of knowledge: Knowledge flows into CEoAI
● Economics of AI (mainstream) → considering incentives, organisation, economic impacts
● AI progress forecasting → taking into account all inputs / outputs into the AI production function
● AI metascience → paying special attention to R&D dynamics & the impact of AI on science
● AI safety → not assuming that "AI will work"
● AI governance → modelling the relationship between organisations and institutions
● AI ethics → acknowledging a broad range of values
● Critical studies of AI → studying the social and cultural determinants of AI's trajectory
17. 3. State of knowledge: CEoAI requires…
■ New data sources capturing important dimensions of AI R&D
■ New analytics to find AI and measure its ecosystems, composition and trajectory
18. 4. New directions: Some examples (my work)
● Deep Learning, Deep Change? (2018)
● Gender diversity in AI research (2019)
● Mapping innovation missions (2019)
● A Narrowing of AI research (2020)
● AI and the fight against Covid-19 (2020)
● The Privatisation of AI research(ers) (2021)
[Diagram: each study positioned on the AI production function — inputs (Funding, Talent, Data, Compute, Research, Code / tech, Models), organisations, institutions, ecosystems and polities, outputs and impacts]
19. 4. New directions: A Narrowing of AI research?
Is AI research becoming thematically narrower?
■ Analysis of AI preprint corpus enriched with institutional information
■ Use of topic modelling to characterise diversity
■ Evidence of stagnation in thematic diversity
■ Private companies less diverse (and more influential)
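The diversity-measurement step in a pipeline like this can be sketched with a Shannon entropy score over topic shares. This is a minimal illustration, not the paper's actual method: the topic profiles below are invented, and the real study works from a topic model fitted to a large preprint corpus.

```python
import math

def shannon_diversity(topic_shares):
    """Shannon entropy of a distribution of research activity over topics.

    Higher entropy = activity spread evenly over many topics (more
    thematic diversity); lower entropy = concentration on a few topics.
    """
    total = sum(topic_shares)
    probs = [s / total for s in topic_shares if s > 0]
    return -sum(p * math.log(p) for p in probs)

# Hypothetical topic-activity profiles for two kinds of organisation
academia = [12, 10, 11, 9, 13, 10]   # activity spread across six topics
industry = [45, 3, 2, 1, 2, 2]       # concentrated on one dominant topic

print(shannon_diversity(academia) > shannon_diversity(industry))  # True
```

The same score can be computed per year to look for the stagnation in thematic diversity mentioned above.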
20. 4. New directions: The privatisation of AI research(ers)
Is there a brain drain of AI researchers into industry? What are its drivers and impacts?
■ Analysis of AI articles enriched with institutional information
■ Survival analysis of researcher transitions from academia to industry
■ Industry hiring influential male researchers with expertise in deep learning
■ Ambivalent link between transitioning and influence: short-term benefits offset by decline in citation levels over time
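To make the survival-analysis idea concrete, here is a hand-rolled Kaplan-Meier estimator over hypothetical career records. The real study uses a full survival model with covariates; the five researchers and their durations below are invented for illustration.

```python
def kaplan_meier(durations, moved):
    """Kaplan-Meier estimate of the probability that a researcher is still
    in academia t years after their first AI publication.

    moved[i] is True if researcher i was observed moving to industry after
    durations[i] years, False if censored (still in academia at last
    observation).
    """
    event_times = sorted({d for d, m in zip(durations, moved) if m})
    surv, curve = 1.0, []
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)
        events = sum(1 for d, m in zip(durations, moved) if d == t and m)
        surv *= 1 - events / at_risk  # survival drops at each event time
        curve.append((t, surv))
    return curve

# Five hypothetical researchers: years until move (or until last observation)
years = [2, 3, 3, 5, 6]
moved = [True, True, False, True, False]
for t, s in kaplan_meier(years, moved):
    print(f"P(still in academia after {t} years) = {s:.2f}")
```

Censoring matters here: researchers who have not (yet) moved still contribute to the at-risk denominator, which is what distinguishes this from a naive share-who-moved calculation.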
21. 4. New directions: More examples
● Factors driving advances in AI benchmarks (Martinez-Plumed et al, 2021)
● Evolution in the use of data in ML research (Koch et al, 2021)
● Impact of public funding on AI's trajectory (Iori et al, 2021)
● Interactions between national and corporate innovation systems (Lundvall and Rikap, 2022)
● Interaction between AI & hardware (Hooker, 2020; Prytkova et al, 2021)
● Organisational implications of AI (Bresnahan, 2019)
● Historical role of power in shaping AI's trajectory (Mohamed et al, 2020)
[Diagram: each study positioned on the AI production function]
22. 4. New directions: Future possibilities [code]
What can we learn about AI's structure and trajectory from its open code ecosystem?
■ Papers with Code matches AI research with benchmark performance and open source code
■ GitHub has an open API to collect data in real time
Example questions:
■ Why are private companies so active in the open source space?
■ Is research fragmenting along the lines of open source frameworks?
■ Can we use open source data to measure diffusion?
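As a sketch of the raw material the GitHub REST API provides for this kind of work: a repository record (from `https://api.github.com/repos/{owner}/{repo}`) and the fields one might track as diffusion proxies. The field names are the API's real ones; the record below is hand-written and its counts are invented, not live figures.

```python
import json

# A repository record in the shape returned by GitHub's REST API
# (fields abridged; counts invented for illustration).
sample = json.loads("""
{"full_name": "pytorch/pytorch", "stargazers_count": 74000,
 "forks_count": 20000, "created_at": "2016-08-13T05:26:41Z"}
""")

def repo_summary(record):
    """Extract the fields we might track to measure diffusion of an AI
    framework through its open source ecosystem."""
    return {
        "name": record["full_name"],
        "stars": record["stargazers_count"],   # attention / adoption proxy
        "forks": record["forks_count"],        # reuse / derivative work proxy
        "since": record["created_at"][:4],     # year the repo appeared
    }

print(repo_summary(sample))
```

In a real pipeline the same summary would be computed over the repositories linked from Papers with Code, then tracked over time.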
23. Could we use GitHub data to capture the network structure of AI software and its evolution?
Valverde and Solé (2015)
24. 4. New directions: Future possibilities [Trajectories]
What institutional factors are narrowing AI's trajectory, and what are its impacts?
■ We know what papers have been accepted and rejected at prestigious conferences like NeurIPS
■ We know the tasks (benchmarks) that are tackled by different papers (from PwC)
Example questions:
■ Is the review process narrowing AI's trajectory? What about publication races?
■ What is the value of non-mainstream trajectories in e.g. tackling different problems / injecting novelty into the mainstream?
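One simple way to operationalise "narrowing" with the PwC task labels mentioned above is a concentration index over benchmark tasks per year. A sketch with invented task labels (the real analysis would use the full Papers with Code task taxonomy):

```python
from collections import Counter

def herfindahl(task_counts):
    """Herfindahl-Hirschman index of how concentrated research activity is
    across benchmark tasks (1.0 = all papers on a single task)."""
    total = sum(task_counts.values())
    return sum((n / total) ** 2 for n in task_counts.values())

# Hypothetical task labels, in the style of Papers with Code, for two years
papers_2015 = ["image-classification", "speech", "translation", "qa"]
papers_2021 = ["image-classification"] * 6 + ["translation", "qa"]

print(herfindahl(Counter(papers_2015)))  # 0.25 — activity evenly spread
print(herfindahl(Counter(papers_2021)))  # higher — activity has concentrated
```

A rising index over successive years would be one signal of a narrowing trajectory; comparing accepted vs rejected papers on the same index gets at the review-process question.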
25. 4. New directions: Future possibilities [Impacts]
Do technical limitations skew AI applications towards less societally beneficial areas?
■ We can use research / funding metadata and semantic analysis to study where AI is being applied (and by whom) in specific domains, e.g. health
Example questions:
■ Are AI applications skewed towards data-rich domains?
■ What is the role of non-mainstream techniques in diversifying AI's focus?
■ Who generates / funds what applications?
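A toy version of the tagging step: assigning project abstracts to disease areas via a controlled vocabulary. This keyword matching is a crude stand-in for the semantic analysis the slide refers to, and the vocabulary below is invented for illustration.

```python
# Hypothetical controlled vocabulary mapping disease areas to trigger terms
DISEASE_TERMS = {
    "diabetes": {"diabetes", "insulin", "glycemic"},
    "oncology": {"cancer", "tumour", "oncology"},
}

def tag_abstract(abstract):
    """Tag a project abstract with the disease areas it mentions — a
    keyword stand-in for proper semantic analysis of funding metadata."""
    words = set(abstract.lower().split())
    return sorted(area for area, terms in DISEASE_TERMS.items()
                  if words & terms)

print(tag_abstract("Deep learning for tumour segmentation in oncology"))
```

Aggregating tags by funder then gives a first-pass answer to "who funds what applications"; embedding-based matching would replace the keyword sets in practice.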
26. Who is funding AI projects to tackle chronic diseases with different topic compositions?
Mateos-Garcia (2019)
27. 4. New directions: Future possibilities [Impacts]
What is the impact of AI on scientific labour markets?
■ Deep learning is set to transform (and is already transforming) scientific R&D in areas such as structural biology, genomics and materials science
■ Availability of open source software is playing an important role
Example questions:
■ Can we use scientometric methods to build skills maps in research fields and quantify their exposure to AI?
■ How are researchers moving through these maps in response to exposure? What are the risks?
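In the spirit of task-overlap exposure indices, the field-level score could start as simply as the share of a field's skills that overlap with demonstrated AI capabilities. Everything below is illustrative: the skill and capability sets are invented, and a real skills map would be built scientometrically from publication data.

```python
def exposure(field_skills, ai_capabilities):
    """Share of a field's skills that overlap with tasks AI systems can
    already perform — a crude exposure-to-automation score in [0, 1]."""
    return len(field_skills & ai_capabilities) / len(field_skills)

# Hypothetical sets: what AI can do vs what the field's work involves
ai = {"structure-prediction", "image-segmentation", "literature-search"}
structural_biology = {"structure-prediction", "crystallography",
                      "protein-purification", "literature-search"}

print(exposure(structural_biology, ai))  # 0.5
```

Recomputing the score as the capability set grows (e.g. after a result like AlphaFold adds structure prediction) shows how a field's exposure shifts over time.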
28. Can we build maps like this about exposure to automation in e.g. structural biology?
(Sleeman et al, 2020)
29. Conclusions
■ There are big gaps in our understanding of AI's trajectory that could hinder policy
■ These are linked to disciplinary silos. In particular, mainstream economics of AI is neglecting some important questions in the field
■ A complex economics of AI could help bridge this gap
■ This will require working with new data sources and (data science) methods
■ There are many interesting questions that we could explore
■ We need institutional innovations to produce timely, policy-relevant evidence in this fast-moving domain 🏃🏃🏃🏃🏃