This document discusses problems and solutions related to open data for the Marseille-Provence 2013 European Capital of Culture project. Key issues addressed include overcoming resistance to open data by stakeholders, establishing an architecture to avoid centralization and maintain control of data by partners, allowing third parties to profit from public data aggregation while preventing raw data sales, and focusing on static rather than real-time dynamic data to avoid liability concerns from data providers. The project aims to attract 10 million visitors through improved digital services enabled by open data.
Tron: Legacy was a large-scale film from 2010 that used CGI in innovative ways compared to typical films. Specifically, it used CGI to create one character that interacted with real-life actors on set, rather than using full CGI for the entire film like in Happy Feet. This allowed them to blend practical and virtual elements in a new way for the film.
Analyzing the core principles that guide the Shopper Journey toward the purchase of a product and its category is vital for understanding our Shopper's purchasing choices, as well as for implementing effective Shopper Marketing interventions that influence behavior and enable business growth. The focus must therefore be on why and for which occasion people buy, on the Shopper's different attitudes and needs at the shelf and at the moment of purchase, and on the ability of Brands and Retailers to remove barriers and facilitate purchases. It has in fact been widely demonstrated that there is a direct correlation between the speed of the Shopper's trip and the size of the receipt.
Ana Tudor "Liviu Rebreanu" School in Mioveni, Romania was awarded the European Quality Label for their project "Let's discover a better world" on October 6, 2009. The award was signed by Marc Durando from the Central Support Service and Simona Velea from the National Support Service of Romania.
Rules of engagement for Trends in Kids en Jongerenmarketing 2010 – Polle de Maagt
A plea for more stuff worth sharing. About the fact that internet is a conversation network, that brands should exceed expectations, that we all love little acts of kindness and that communication is a mere spotlight on remarkable acts.
Assignment #6 Research A Double Page Spread – media_jojo
The double page spread layout is effective and eye-catching. It uses large dominant images of the person being interviewed. Important quotes are in a bigger font to draw the eye. The organization of the information with headings and the capital letter starting the interview make it clear to read. Photos are manipulated with lighting and cropping to highlight key details.
People are more likely to do business with those they know, like, and trust. In the past, companies focused on gaining attention through branding to be liked, but in today's world with Internet access, customers have more information and choice. Companies now need to build genuine relationships with customers, provide meaningful services, and adapt to changes in order to survive. Authenticity and prioritizing customers are key to success in the new business environment.
The document summarizes a debriefing meeting about integrating information and communication technologies (ICT) into an existing drought early warning system (DEWS) in Ethiopia. Key findings show that while DEWS has been successful in information sharing, ICT opportunities are feasible with a strategic plan that builds on existing strengths and takes risks using participatory approaches. Recommendations include keeping the ICT approach simple, participatory, focused on the future while leveraging existing communication pathways and stakeholder relationships. Maps presented show current DEWS information flows and potential opportunities to incorporate SMS, internet, databases to improve data collection, reporting, and decision making.
Sahana General 2009 Community And System Talk – Sahana
The document summarizes the evolution of the Sahana system and community from its origins responding to the 2004 Indian Ocean tsunami to its development into a global open source disaster management platform. It describes how Sahana was initially built hastily during the tsunami crisis, then redesigned as a modular open source system to address common disaster problems and attract broader participation. It outlines key Sahana applications and how the system and community have continued to respond to new disaster needs and official government deployments around the world.
Mobile collaboration tools can help improve disaster response efforts. The document discusses challenges with collaboration during crises, such as slow networks and lack of information sharing. It introduces free and open-source mobile tools created to help address these gaps, including Mesh4X for data synchronization, GeoChat for location-based messaging, and mobile forms for collecting field data. The tools aim to facilitate real-time information exchange, analysis, and coordinated response during emergencies. The document also reflects on design principles for collaboration tools and on ensuring broad adoption beyond crisis scenarios.
Multi Discipline Intelligence Production Teams 1 – DataTactics
The document discusses the need for multi-disciplinary intelligence production teams to help address challenges posed by increasing data volumes and proposes integrating experts from different fields like IT, software, statistics and intelligence to work together on tackling complex problems. It provides examples of how such integrated teams could support mission requirements by developing new processes, data products, tools and visualizations to gain actionable insights from large and diverse datasets. The document also outlines some accomplishments of integrated data analysis teams in supporting organizations like DEA and DoD with detecting illicit activity and identifying unknown threats.
RDC - Benoit Pierenne: Data Interoperability – ICASRAI
This document discusses challenges and solutions around research data management in Canada. It argues that the real challenges are curating metadata and ensuring long-term access to data, rather than hardware storage. It proposes establishing data stewardship facilities that can provide long-term storage, access, and curation of research data from multiple related projects in a cost-effective way. Such facilities could help address issues around scattered and inaccessible data by acting as a central portal and ensuring data stewardship beyond individual projects. Examples of existing Canadian data stewardship facilities in astronomy, polar research, and social/health statistics are provided.
Internationalisation Of Digital Media Companies – Tommi Pelkonen
This document summarizes a presentation on the internationalization of the Finnish digital media industry. It discusses the background and objectives of the study, which analyzed the patterns of internationalization in the industry at both the company and industry levels. It provides an overview of the theoretical approaches taken and defines what is considered the digital media industry. Key areas analyzed include the internationalization strategies and networks of digital media companies in Finland. The presentation is based on surveys, interviews and case studies of companies in the Finnish digital media sector.
From ePrescription to eHealth – the eHealth strategy (Slovacia-ehealth-8iulie2010) – Agora Group
The document discusses Slovakia's national ePrescription approach and eHealth strategy. It outlines the milestones in developing the strategy, including establishing stakeholder requirements and feasibility studies. The strategy involves implementing ePrescription and other priority projects in waves from 2010-2011 and beyond. Key elements are developing a national health portal, electronic patient records, and supporting efficient prescribing and medication management while improving health outcomes and reducing costs.
The document discusses DataONE, a project aimed at improving data repository interoperability and advancing best practices in data lifecycle management. It focuses on enabling access to multiple external data repositories from within a HUB environment. This would allow users to aggregate and integrate disparate datasets for new analyses, and enable reproducible workflows. The goal is to address issues around scattered and dispersed data by improving discovery, integration and long-term preservation of datasets.
The document discusses key topics related to big data including its definition, characteristics, sources, storage, analytics applications, risks, and tools. It also covers data science, the role of data scientists, and challenges in working with big data. Big data is defined as large volumes of diverse data that are difficult to process using traditional methods due to size and complexity. Common sources include scientific instruments, mobile devices, social media, and sensors. Storing and analyzing big data requires distributed and scalable tools and techniques.
This document summarizes research into business models for open data in Dutch public institutions. It identifies two main business models - incremental, which fits into existing strategies, and radical, which enables brand new strategies. Case studies of seven public organizations are analyzed. In general, organizations do not charge for raw open data but may consider fees for excessive usage. They also want to maintain their own presentation services for user feedback. Most organizations expect open data to provide financial benefits.
The document discusses Canada's Multi-Agency Situational Awareness System (MASAS), which aims to simplify information sharing between emergency response agencies by providing a single system to share incident information in real-time, rather than through separate communication channels. MASAS uses an open architecture and standards to allow various response tools and agencies to share information on a common operating picture in order to improve coordination and response times. The system has gained recognition in Canada as a national priority for public safety and has expanded to include over 225 agencies and organizations.
Thailand's disaster information systems are not well coordinated and do not effectively support decision makers. Government data and information is often stored in documents like Word files that are difficult to access and analyze computationally. There is a need for Thailand to create a single, open disaster data system that consolidates information from different ministries and makes the data available via open standards and APIs to support uses like mobile apps, analysis tools, and crowdsourcing systems. This could help improve coordination, access to information, and disaster response efforts in Thailand.
Presentation given by IFAC Executive Director, Governance and Operations, Alta Prinsloo at the South African Institute of Professional Accountants National Conference. Presentation details the current status of integrated reporting globally and in Africa and how the issue is playing out for small- and medium-sized entities (SMEs).
A short introduction to GEO governance, the GEO Work Programme and the GEO community for the FOSS4G audience. Contributions on GEOGLOWS, eShape and GEOHack19 from Julia Wagemann, Valentina Balcan and Diana Mastracci.
This document summarizes a checklist for assessing the readiness of a spatial data infrastructure (SDI). It covers key components such as understanding spatial data holdings and requirements, developing an SDI vision and strategy, policy readiness, and collaboration. The checklist contains questions in each area to help evaluate an SDI's maturity and guide its further implementation, focusing on issues like formal information audits, stakeholder engagement, performance indicators, costs, policy details, and cross-border data sharing.
Open Government Data - Security Risk or Means for Threat Prevention – Johann Höchtl
This document discusses open government data and its associated security risks and future prospects. It begins by outlining the political mindset of transparency that drives open data initiatives. It then defines open government data and provides examples. Reasons for open data include increased transparency, efficiency and trust in government. However, publicly releasing certain data sets could enable threats like targeting critical infrastructure or planning attacks. Future research is needed to assess security risks and how to select and release data to maximize benefits while mitigating risks. The document concludes by discussing a model for evaluating open data initiatives based on their overall security and welfare impacts.
Stream reasoning: mastering the velocity and the variety dimensions of Big Da... – Emanuele Della Valle
More and more applications require real-time processing of heterogeneous data streams. In terms of the “Vs” of Big Data (volume, velocity, variety and veracity), they must address velocity and variety at the same time. Big Data solutions that handle velocity and variety separately have been around for a while, but only Stream Reasoning approaches the two dimensions at once. Current results in the Stream Reasoning field are relevant for application areas that require handling massive datasets, processing data streams on the fly, coping with heterogeneous, incomplete and noisy data, providing reactive answers, supporting fine-grained information access, and integrating complex domain models. Starting from those requirements, this talk frames the problem addressed by Stream Reasoning. It poses the research question and operationalises it with four simpler sub-questions. It describes how the database group of Politecnico di Milano answered those sub-questions over the last seven years of research. It briefly surveys alternative approaches investigated by other research groups worldwide and elaborates on current limitations and open challenges.
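The windowing idea at the heart of such stream-processing engines can be sketched in a few lines. This is a toy illustration only, not the engine the talk describes; the class and the sensor keys are invented. A continuous query is re-evaluated over a sliding time window as new timestamped events arrive, which is how velocity (bounded, incremental state) is reconciled with on-the-fly querying:

```python
from collections import deque

class SlidingWindowCount:
    """Toy continuous query: count events per key over the last `width`
    time units. Illustrates the sliding-window idea used by stream
    reasoning engines; names here are invented for the example."""

    def __init__(self, width: int):
        self.width = width
        self.window = deque()  # (timestamp, key) pairs, oldest first

    def push(self, timestamp: int, key: str) -> dict:
        self.window.append((timestamp, key))
        # Evict tuples that have fallen out of the window (t - width, t].
        while self.window and self.window[0][0] <= timestamp - self.width:
            self.window.popleft()
        # Re-evaluate the query over the current window contents.
        counts = {}
        for _, k in self.window:
            counts[k] = counts.get(k, 0) + 1
        return counts

q = SlidingWindowCount(width=10)
q.push(1, "sensor/a")
q.push(5, "sensor/b")
print(q.push(12, "sensor/a"))  # the tuple from t=1 has expired
```

Real engines add a declarative query language and reasoning over the window contents on top of this eviction-and-reevaluate loop; the state stays proportional to the window, not to the full stream.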
Facilitating a Digital Commons with Free and Open Source Software: Paving the... – Sameer Verma
This document summarizes a presentation given by Sameer Verma on facilitating a digital commons using free and open source software (FOSS). It introduces San Francisco State University and provides an overview of FOSS, including definitions of free software and open source software. It discusses FOSS licensing and provides examples of popular FOSS applications and projects. It also addresses FOSS adoption policies in Southeast Asian countries and the use of FOSS for educational purposes.
"...not the same. Wisdom is the ability to apply one's knowledge and experience with good judgment."
1) Organizations have lost millions due to poor data management practices but remain unaware of the root causes.
2) Unless the costs of poor data management are quantified, gaining approval for basic investments will be difficult.
3) The talk illustrates how to identify specific costs of poor practices in HR, finance, supply chain, and compliance to show data management as the root cause of problems and gain support for required investments.
Statbel is the national statistical institute of Belgium and a member of the European Statistical System. Big data presents new opportunities and challenges for official statistics. Statbel has established a Big Data Team and conducted projects using mobile phone data, web scraping, and satellite imagery. Key challenges include gaining access to proprietary data sources and developing statistical methodology for large, unstructured datasets. Smart statistics that integrate multiple data sources in real time could provide detailed monitoring systems for issues like air quality and population estimates. Statbel aims to further develop use cases for big data and may need to collaborate with data scientists or hire them internally to fully leverage big data.
The document discusses the Open Government Licence, noting that it allows for both commercial and non-commercial free use of public sector information with simple conditions of attribution and prohibiting misuse. It also notes that the licence is machine-readable and interoperable with other models like Creative Commons licenses. The document raises the benefits of machine-readable licenses for efficient information transfer and having a shared vocabulary, while also acknowledging there are still challenges to address and opportunities for the future of open licensing.
This document discusses privacy by design (PbD). PbD requires building privacy protections directly into systems and practices through principles like data protection by default and design. It involves implementing privacy-enhancing technologies and tools to empower users. One example is an identity protector that uses pseudonymization. PbD faces challenges in implementation due to lack of economic incentives and legacy systems. The document recommends a toolbox approach to PbD, with privacy impact assessments and patterns, as well as certification and standards to facilitate adoption.
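The identity-protector idea can be made concrete with a minimal sketch, assuming the simplest form of pseudonymization: direct identifiers are replaced with keyed hashes, so records remain linkable across datasets without revealing who they refer to. The key name and field names below are invented for illustration; this is not the tooling the presentation describes:

```python
import hmac
import hashlib

# Secret key held only by the "identity protector" role; a stand-in value.
SECRET_KEY = b"example-key-rotate-in-practice"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed hash (HMAC-SHA256).

    The same input always maps to the same pseudonym, so records stay
    linkable, but the mapping cannot be reversed without the secret key.
    """
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"name": "Alice Example", "visits": 12}
safe_record = {"patient_id": pseudonymize(record["name"]),
               "visits": record["visits"]}
print(safe_record)  # the name never leaves the protected zone
```

A plain unkeyed hash would not suffice here: common identifiers could be recovered by brute force, which is why the keyed construction (and keeping the key with a separate trusted role) is part of the design rather than an implementation detail.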
This document summarizes Polish law regarding the reuse of public sector information and its relationship to personal data protection. Key points include:
- Poland implemented the PSI Directive through an Access to Public Information Act which allows free or low-cost reuse of public information.
- Reusing public information often involves processing personal data, so reusers must comply with Poland's Personal Data Protection Act.
- There are legal controversies around complying with data protection principles, such as purpose limitation, when data are reused for a purpose other than the one for which they were originally collected.
- Notification obligations on new controllers who reuse personal data can be burdensome, but they also enable data subject rights; exemptions may apply where the effort would be disproportionate.
This document summarizes a presentation given by Wojciech Wiewiórowski on privacy and open data. The presentation discusses the tension between privacy and the reuse of public sector information, including concerns about personal data being used to create profiles of individuals without their consent. It also reviews relevant EU directives and recommendations from the Council of Europe on issues like transparency around profiling and individuals' rights to access and correct personal data used in profiling.
Andrew Byrd is a lead developer of OpenTripPlanner, an open source multi-modal trip planning system. He is a PhD candidate studying urbanism. OpenTripPlanner uses open data standards like GTFS and OSM to plan trips by public transit, bicycle, or other modes. It has been deployed in several countries and powers mobile apps. Byrd's role involves adapting the trip planner for research and urban planning applications.
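GTFS, one of the open standards OpenTripPlanner consumes, is simply a ZIP archive of CSV files. As an illustrative sketch (the tiny in-memory feed below is invented, not a real dataset, and `load_stops` is a hypothetical helper rather than OpenTripPlanner's own API), parsing the stops table looks like this:

```python
import csv
import io
import zipfile

# Build a minimal in-memory GTFS feed: a ZIP containing stops.txt.
feed = io.BytesIO()
with zipfile.ZipFile(feed, "w") as z:
    z.writestr(
        "stops.txt",
        "stop_id,stop_name,stop_lat,stop_lon\n"
        "S1,Central Station,55.6761,12.5683\n"
        "S2,Harbour,55.6867,12.5700\n",
    )

def load_stops(gtfs_bytes):
    """Parse stops.txt from a GTFS zip into a list of dicts."""
    with zipfile.ZipFile(gtfs_bytes) as z:
        with z.open("stops.txt") as f:
            reader = csv.DictReader(io.TextIOWrapper(f, "utf-8"))
            return [
                {
                    "id": row["stop_id"],
                    "name": row["stop_name"],
                    "lat": float(row["stop_lat"]),
                    "lon": float(row["stop_lon"]),
                }
                for row in reader
            ]

stops = load_stops(feed)
```

A real feed adds routes.txt, trips.txt and stop_times.txt on top of the same zipped-CSV layout, which is what makes the format easy for third-party tools to consume.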
This document provides links to two websites related to open transport initiatives: epsiplatform.eu/transport, which contains the full version of information on an open transport project, and transport.okfn.org, where users can join the Open Knowledge Foundation's open transport project.
This document discusses the benefits and challenges of open data for public transportation organizations. It describes how Rejseplanen, the Danish national journey planner, shares data openly which has led to partnerships and third-party applications. While there are questions around losing control and customers, the experience has shown that open data can drive more demand and ridership by empowering developers. The conclusion is that organizations should share their public transportation data openly and wholeheartedly in order to co-create value for customers.
This document summarizes Dorota Szeligowska's presentation on transport data from local to national to European levels. It discusses the EU policy framework for intelligent transport systems (ITS), including the ITS Action Plan and ITS Directive, which aim to coordinate ITS deployment, make road transport more sustainable, and integrate vehicle and transport infrastructure. It also describes ongoing work to promote multimodal travel planning and information services across Europe through specifications for EU-wide multimodal travel information and potential establishment of a stakeholder platform for common solutions like smart ticketing.
The document discusses open data and liability, specifically analyzing relevant case law. It examines cases where publishers of special purpose maps and data were acquitted of liability as well as cases where liability was found or remanded for maps and endorsements. The document was presented by Dr. Christian Laux at the 2013 ePSI conference on getting stakeholders to support open data initiatives.
The document discusses liability issues related to open data and summarizes a legal case about a shipwreck. In the case, claims were made that notice to mariners did not include information about a sandbank and that navigators could not be expected to have the latest chart. However, the final decision found no negligence, and determined that any professional navigator would have had the latest chart and that the shipwreck was actually due to the condition of the ship. The document also notes that liability for data is not just a legal issue, as governments may pay due to potential bad publicity from lawsuits even if they are not at fault.
The document discusses how open data and big data can be used to create value through new business models and transformation. It provides examples of how Socrata helped organizations unlock value from their data through open data strategies like interactive data experiences, APIs, custom apps, and data visualization. The use of open data APIs and a cloud-based infrastructure are presented as best practices for enabling developers and businesses to access and reuse organizational data.
The Non-Governmental Centre on Access to Public Information lobbies public authorities to expand access to information and promotes the right to information. It gathers expertise on freedom of information to help citizens request and access public information within legal timeframes, though some authorities improperly assume that requests indicate intent to reuse data and deny access. The organization aims to make information access procedures more practical and to motivate development of the potential of public sector information.
E psi tomek-zielinski-transportoid-conference-slides — ePSI Platform
Transportoid is a popular mobile app in Poland that provides public transportation timetables for 60 cities. It obtains data through web scraping, which can be inefficient and miss important information for complex routes. For the city of Kraków, the carrier and authorities refuse to share official data, even though the creator won court cases requiring them to do so. The summary argues that the law needs to change to ensure frequent access to dynamic public transportation data.
MojaPolis is a web service and database that provides easy access to socio-economic data for regions, sub-regions, districts, and municipalities in Poland through maps, rankings, charts, and other visualizations. Its main goals are to open public data, gather data from different sources, and give free access to data to strengthen public debate. Its target groups are active citizens, NGO leaders, decision-makers, academics, students, and journalists, and its main functions are information, analysis, education, planning, evaluation, and advocacy.
The document discusses a new transparency law passed in Hamburg, Germany through a citizen initiative process. The law aims to make government information more openly accessible to citizens by default online, rather than requiring freedom of information requests. It establishes categories of information that must be proactively published, including government decisions, contracts, expert advice, and statistics. The law was passed unanimously by parliament after receiving over 62,000 signatures in support. It will be implemented in phases, with full implementation targeted for October 2014.
The document discusses the European PSI Scoreboard, which uses crowdsourced data and non-discerning indicators to monitor the PSI re-use "ecology" across Europe. It is not intended to monitor governments, but rather focuses on 7 groups of indicators related to PSI re-use obstacles. Upcoming events are also mentioned, including a conference on February 22 in Warsaw about the European PSI Scoreboard.
"Choosing proper type of scaling", Olena SyrotaFwdays
Imagine an IoT processing system that is already quite mature and production-ready, whose client coverage keeps growing, and for which scaling and performance are life-and-death questions. The system has Redis, MongoDB, and stream processing based on ksqlDB. In this talk we will first analyze scaling approaches and then select the proper ones for our system.
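The horizontal-scaling idea that underlies systems like Redis Cluster hash slots and the partitioned Kafka topics ksqlDB builds on can be sketched as deterministic key partitioning. The shard names and device keys below are hypothetical, a minimal illustration rather than the talk's actual architecture:

```python
import hashlib

# Hypothetical shard names for illustration only.
SHARDS = ["redis-a", "redis-b", "redis-c"]

def shard_for(key: str, shards=SHARDS) -> str:
    """Map a key deterministically onto one shard.

    Hashing the key and taking it modulo the shard count means the
    same device always lands on the same node, so per-device state
    stays local while overall load spreads across the cluster.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return shards[int(digest, 16) % len(shards)]

assignments = {k: shard_for(k) for k in ("device-1", "device-2", "device-3")}
```

Production systems refine this with fixed slot tables or consistent hashing so that adding a shard does not remap every key, which is exactly the kind of trade-off a "choosing the proper type of scaling" discussion weighs.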
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of the presentation I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on communications and signalling systems on railways, held at the Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). It was attended by around 500 participants and 200 online followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfChart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor IvaniukFwdays
In this talk we will discuss DDoS protection tools and best practices, network architectures, and what AWS has to offer. We will also look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022: what techniques helped keep web resources available for Ukrainians, and how AWS improved DDoS protection for all customers based on the Ukraine experience.
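One classic building block behind the rate-limiting best practices such talks cover is the token bucket: requests spend tokens that refill at a fixed rate, so short bursts pass while sustained floods are rejected. The sketch below is a generic illustration of the technique, not AWS Shield's or any vendor's actual implementation:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter (illustrative sketch)."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Return True if a request may proceed, spending one token."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A burst of 8 immediate requests against a bucket of capacity 5:
bucket = TokenBucket(rate=10, capacity=5)
results = [bucket.allow() for _ in range(8)]
```

In practice this logic runs per client IP or per API key at the edge, which is why attribution of traffic (and the network architecture in front of it) matters as much as the limiter itself.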
What is an RPA CoE? Session 1 – CoE VisionDianaGray10
In the first session, we will review the organization's vision and how this has an impact on the COE Structure.
Topics covered:
• The role of a steering committee
• How do the organization’s priorities determine CoE Structure?
Speaker:
Chris Bolin, Senior Intelligent Automation Architect Anika Systems
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...Jason Yip
The typical problem in product engineering is not bad strategy, so much as “no strategy”. This leads to confusion, lack of motivation, and incoherent action. The next time you look for a strategy and find an empty space, instead of waiting for it to be filled, I will show you how to fill it in yourself. If you’re wrong, it forces a correction. If you’re right, it helps create focus. I’ll share how I’ve approached this in the past, both what works and lessons for what didn’t work so well.
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframePrecisely
Inconsistent user experience and siloed data, high costs, and changing customer expectations – Citizens Bank was experiencing these challenges while it was attempting to deliver a superior digital banking experience for its clients. Its core banking applications run on the mainframe and Citizens was using legacy utilities to get the critical mainframe data to feed customer-facing channels, like call centers, web, and mobile. Ultimately, this led to higher operating costs (MIPS), delayed response times, and longer time to market.
Ever-changing customer expectations demand more modern digital experiences, and the bank needed to find a solution that could provide real-time data to its customer channels with low latency and operating costs. Join this session to learn how Citizens is leveraging Precisely to replicate mainframe data to its customer channels and deliver on their “modern digital bank” experiences.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
5th LF Energy Power Grid Model Meet-up SlidesDanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Microsoft Teams session or in person at TU/e, located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/how-axelera-ai-uses-digital-compute-in-memory-to-deliver-fast-and-energy-efficient-computer-vision-a-presentation-from-axelera-ai/
Bram Verhoef, Head of Machine Learning at Axelera AI, presents the “How Axelera AI Uses Digital Compute-in-memory to Deliver Fast and Energy-efficient Computer Vision” tutorial at the May 2024 Embedded Vision Summit.
As artificial intelligence inference transitions from cloud environments to edge locations, computer vision applications achieve heightened responsiveness, reliability and privacy. This migration, however, introduces the challenge of operating within the stringent confines of resource constraints typical at the edge, including small form factors, low energy budgets and diminished memory and computational capacities. Axelera AI addresses these challenges through an innovative approach of performing digital computations within memory itself. This technique facilitates the realization of high-performance, energy-efficient and cost-effective computer vision capabilities at the thin and thick edge, extending the frontier of what is achievable with current technologies.
In this presentation, Verhoef unveils his company’s pioneering chip technology and demonstrates its capacity to deliver exceptional frames-per-second performance across a range of standard computer vision networks typical of applications in security, surveillance and the industrial sector. This shows that advanced computer vision can be accessible and efficient, even at the very edge of our technological ecosystem.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar, with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able to lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc.
- Practical examples and best practices to implement right away