Designing the User Experience
The User Experience Professionals Association (UXPA) defines user experience this way:
Every aspect of the user’s interaction with a product, service, or company that make up the user’s perceptions of the whole. User experience design as a discipline is concerned with all the elements that together make up that interface, including layout, visual design, text, brand, sound, and interaction.
Additionally, the UX designer has the goal of making this user experience usable, useful, desirable, valuable, findable, credible, and accessible. That’s a lot to keep in mind!
In this talk, Jason and Nadine will explain how UX designers gain an understanding of their users’ tasks (and the way they think about them), how they use this knowledge to design better UIs and better content, and finally how these designs are validated and evolved over time as users continue to interact with the product.
We’ll also briefly describe the deliverables often used by designers to communicate their work to clients, and how best to prepare yourself for engaging a user experience design agency to contribute to your product design process.
Jason Wehmhoener and Nadine Schaeffer, Cloudforest Design
Since 1996, Nadine Schaeffer and Jason Wehmhoener have helped companies both large and small execute a user-centered design process. Their seasoned expertise in interaction design, information architecture, user research, visual design, and front-end engineering has helped many companies launch successful products. Their clients have included Apple, Google, Yahoo, Plantronics, Cisco, Juniper Networks, Oracle, Adobe, Seagate, Citrix, Disney, Sunrun, Fiserv, E*Trade, Verizon, and many more.
Legg Mason’s Enterprise, Profit Driven Quest with Oracle EPM CloudAlithya
This presentation was given at Oracle Modern Business Experience, on 3/21/2019 by Wil Adkins, Managing Director FP&A, Legg Mason and Mike Killeen, SVP, Technology & Strategy, Alithya
Supply Chain Advisory and MMIS System Oracle ImplementationAlithya
Many healthcare systems are experiencing rising operation costs and expenses due to operational inefficiencies, making it more challenging — and expensive — to provide quality patient care and engagements.
In this workshop, learn how to establish clear opportunities to optimize your labor, processes, procedures, technology, and inventory costs and a vision to integrate and optimize your current technology investments.
Become an organization that reinvents itself to accelerate efficiencies and cost reduction strategies using digital transformation to provide integrated systems and solutions for an immediate impact.
Digital Transformation in Healthcare: Journey to Oracle Cloud for Integrated,...Alithya
Healthcare has always been a data-driven industry. Until recently, many healthcare institutions struggled with disparate data in siloed enterprise systems because an integrated Cloud solution wasn’t widely available across Enterprise Resource Planning (ERP), Human Capital Management (HCM), and Enterprise Performance Management (EPM) processes. Today more than ever, it is critical that health systems have the ability to deliver quality care at the right cost. Equally important is the ability to remain agile and adapt to events around them. Without the right foundation in place – these things are impossible. That is why digital transformation is the future of healthcare.
Healthcare has always been a data-driven industry. Until recently, many healthcare institutions struggled with disparate data in siloed enterprise systems because an integrated Cloud solution wasn’t widely available across Enterprise Resource Planning (ERP), Human Capital Management (HCM), and Enterprise Performance Management (EPM) processes. Today more than ever, it is critical that health systems have the ability to deliver quality care at the right cost. Equally important is the ability to remain agile and adapt to events around them. Without the right foundation in place – these things are impossible. That is why digital transformation is the future of healthcare.
In this session, attendees learn:
• Establishing the foundation in the chart of accounts. Visualize how the right COA design can provide actionable analytics.
• Gain an understanding of how financials can directly connect with an EPM-focused solution like Planning to reduce manual processes and increase the efficiency of financial operations teams, therefore, assisting in bringing about real cost transparency to your organization.
• Finally, see how a Human Capital Management (HCM) and Payroll solution integrates seamlessly with the Planning process to provide better visibility into and reporting capabilities on the organization’s employees.
Join us as we embark on the Oracle Cloud digital transformation journey in pursuit of improving healthcare outcomes and bending the cost curve.
nter-pod Revolutions: Connected Enterprise Solution in Oracle EPM Cloud Alithya
The session will discuss a library of solutions implemented at clients for transferring between applications in separate pods. Each configuration has its own merits and use case. The four main categories that will be discussed are -
1. Trickle Feed - uses a combination of inter-pod REST API connection, data management load rule, groovy scripting and scheduled EPM Automate job on a jump server to pick-up the files from source and push to target.
2. Focused On-save Push - pushes an intersection from source to target using inter-pod REST API connection, data management load rule and groovy scripting.
3. Scheduled Push- uses a combination of windows or Linux job, inter-pod REST API connection, groovy scripting, data management load rule and EPM Automate commands to extract and push data en masse from source to target.
4. Json Extract and Load - uses a combination of groovy scripting and inter-pod REST API connection to extract and push an intersection on-save.
The audience will walk-away with learnings and understanding of inter-pod configurations, mainly for EPM Cloud planning applications. Snippets of code will form the "gold dust" takeaway from the session.
ODTUG Configuring Workforce: Employee? Job? or Both? Alithya
The document provides an overview of configuring Oracle's EPBCS Workforce module, including:
- The three levels of planning granularity - employee only, job only, or employee and job
- Enabling features such as expense planning, headcount planning, and workforce management
- Setting up custom dimensions, benefits/taxes/earnings, salary grades, and other metadata
- Preparing for planning and forecasting by loading data, setting assumptions and rates, and synchronizing data
Oracle Cloud Time and Labor: Default Payroll Rate, Override Rate and Flat Dol...Alithya
Presentation was given at ODTUG & OHUG HCM Week. Recorded on 11/6/2020 given by Karen N. Settembrino.
Review the configuration required to support showing the default rate on the Responsive UI and the Classic UI time card along with configuring the manager’s ability to override the rate when necessary. In addition, show the setup necessary to support flat dollar amount entries on the time card to pass to payroll.
AUSOUG I Am Paying for my Cloud License. What's Next?Alithya
The document provides an overview of a presentation on profitability and cost management using Oracle EPM Cloud. It discusses key capabilities like flexibility, accuracy, shared methodology, and transparency. It also highlights use cases and demonstrates features through a live demo. The presentation encourages organizations to evaluate where they are versus competitors and determine how to advance their profitability analysis and planning.
A Journey to Profitability with Oracle PCMCSAlithya
A presentation that highlights the easy and effective journey to profitability using the Oracle Profitability and Cost Management solution implemented by Alithya.
Presentation originally delivered by Evan Leffler, Lead Consultant, to a live audience at HUGmn Tech Day in Chaska, MN, 3/14/18.
So, you can write basic calculations or maybe even intermediate. Nearly every Essbase and Planning applications requires calculations. The better you are at calculations, the more business value your applications deliver. But what makes the difference between good enough and great? What makes your code more reliable, faster, and easier to understand and debug? What techniques can you use to think through challenges and come up with solutions? This session offers some time-tested approaches to writing better calc scripts. Topics include:
Getting the requirements straight, thoroughly but without wasting time
Taking advantage of dimensionality and block structure
Variables and parameterization
One script or two?
What needs to be commented?
Alternative approaches to some common challenges
This session is intended for Essbase BSO developers who want to think about what “better code” means and how to write it. It requires at least some knowledge of Essbase BSO calculations.
Hosted by Ron Moore at the ODTUG Learn from Home Series
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications.He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Monitoring and Managing Anomaly Detection on OpenShift.pdfTosin Akinosho
Monitoring and Managing Anomaly Detection on OpenShift
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU und die Lizenzen nach dem CCB- und CCX-Modell sind für viele in der HCL-Community seit letztem Jahr ein heißes Thema. Als Notes- oder Domino-Kunde haben Sie vielleicht mit unerwartet hohen Benutzerzahlen und Lizenzgebühren zu kämpfen. Sie fragen sich vielleicht, wie diese neue Art der Lizenzierung funktioniert und welchen Nutzen sie Ihnen bringt. Vor allem wollen Sie sicherlich Ihr Budget einhalten und Kosten sparen, wo immer möglich. Das verstehen wir und wir möchten Ihnen dabei helfen!
Wir erklären Ihnen, wie Sie häufige Konfigurationsprobleme lösen können, die dazu führen können, dass mehr Benutzer gezählt werden als nötig, und wie Sie überflüssige oder ungenutzte Konten identifizieren und entfernen können, um Geld zu sparen. Es gibt auch einige Ansätze, die zu unnötigen Ausgaben führen können, z. B. wenn ein Personendokument anstelle eines Mail-Ins für geteilte Mailboxen verwendet wird. Wir zeigen Ihnen solche Fälle und deren Lösungen. Und natürlich erklären wir Ihnen das neue Lizenzmodell.
Nehmen Sie an diesem Webinar teil, bei dem HCL-Ambassador Marc Thomas und Gastredner Franz Walder Ihnen diese neue Welt näherbringen. Es vermittelt Ihnen die Tools und das Know-how, um den Überblick zu bewahren. Sie werden in der Lage sein, Ihre Kosten durch eine optimierte Domino-Konfiguration zu reduzieren und auch in Zukunft gering zu halten.
Diese Themen werden behandelt
- Reduzierung der Lizenzkosten durch Auffinden und Beheben von Fehlkonfigurationen und überflüssigen Konten
- Wie funktionieren CCB- und CCX-Lizenzen wirklich?
- Verstehen des DLAU-Tools und wie man es am besten nutzt
- Tipps für häufige Problembereiche, wie z. B. Team-Postfächer, Funktions-/Testbenutzer usw.
- Praxisbeispiele und Best Practices zum sofortigen Umsetzen
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
GraphRAG for Life Science to increase LLM accuracyTomaz Bratanic
GraphRAG for life science domain, where you retriever information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
CAKE: Sharing Slices of Confidential Data on BlockchainClaudio Di Ciccio
Presented at the CAiSE 2024 Forum, Intelligent Information Systems, June 6th, Limassol, Cyprus.
Synopsis: Cooperative information systems typically involve various entities in a collaborative process within a distributed environment. Blockchain technology offers a mechanism for automating such processes, even when only partial trust exists among participants. The data stored on the blockchain is replicated across all nodes in the network, ensuring accessibility to all participants. While this aspect facilitates traceability, integrity, and persistence, it poses challenges for adopting public blockchains in enterprise settings due to confidentiality issues. In this paper, we present a software tool named Control Access via Key Encryption (CAKE), designed to ensure data confidentiality in scenarios involving public blockchains. After outlining its core components and functionalities, we showcase the application of CAKE in the context of a real-world cyber-security project within the logistics domain.
Paper: https://doi.org/10.1007/978-3-031-61000-4_16
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
2. Agenda
• Introduction
• Design examples
• Understanding the customer
• What makes good design
• How to apply this to reporting
• Summary and examples
• Questions
5. Design Examples
- Lots of images
- Attention not drawn to specific items
- Too much color
- Visuals used when charts or data could convey messages better
- Waste of valuable screen real estate
6. Design Examples
- 3-D rendering makes it harder to interpret the values
- Forecasted Units and Dollars lines connect the regions, when they are distinct data points
- Dollars and Year Ago Dollars are stacked, when they are also distinct values
- Excessive use of the word ‘Region’
7. Design Examples
- Recent example of visuals over substance and meaning
- Very difficult to determine correlation between circle sizes
8. Design Examples
- Dark images and background distract the viewer
- Cannot determine trends with only 2 data points
- Confusing as 2010 values are not circles
- Visuals used when charts or data could convey messages better
9. Customer Maturity
Customers are on a reporting journey, determining what their requirements and future plans are.
The maturity pyramid – Foundation, Operations, Mgmt Reports, Dashboards, Scorecards – is important in understanding where they are, what their needs are and where they think they are going.
You must satisfy the pre-requisites before climbing up the pyramid.
10. Customer Maturity
Consider 3 different parts of the organization at different stages of maturity:
- Finance = Area 1
- Logistics = Area 2
- Manufacturing = Area 3
[Chart: Area 1, Area 2 and Area 3 positioned between “Maintain current business operations” and “Areas to grow”]
12. Good design concepts
• Memory limits
• Encoding data for rapid perception
• Gestalt principles of perception
13. Memory Limits
• Iconic memory – visual cues, pre-conscious/pre-attentive processing
• Short term memory – conscious processing, 3-9 chunks only
• Long term memory
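The 3-9 chunk limit is why long digit strings are usually grouped before display. As an illustration (the number and the grouping sizes below are hypothetical, not from the deck), a minimal chunking helper:

```python
def chunk(digits: str, sizes: list[int]) -> str:
    """Split a digit string into memorable chunks (e.g. a phone number)."""
    parts, start = [], 0
    for size in sizes:
        parts.append(digits[start:start + size])
        start += size
    return " ".join(parts)

# An unbroken 10-digit string is one long item to hold in short-term memory;
# chunked into three groups it fits comfortably within the 3-9 chunk limit.
print(chunk("4155550123", [3, 3, 4]))  # → 415 555 0123
```

The same principle applies to report layouts: grouping related figures lets the reader hold the whole display in short-term memory at once.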
14. Data Encoding
- How many 3’s can you find?
- As there is no encoding of data, we process sequentially – attentive processing – very slow!

1723957695026398027384956012
9847536970898726547867925019
2005928976548102985079827158
0297456478597069873940588698
5726327189506972915069871256
2783789
15. Data Encoding
- 7 is the correct answer
- Much easier to see when the data is in a different color; the same goes for bolding, size, shape and orientation changes as well

1723957695026398027384956012
9847536970898726547867925019
2005928976548102985079827158
0297456478597069873940588698
5726327189506972915069871256
2783789
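The pop-out effect can be sketched even in plain text. The snippet below (an illustration, not part of the deck) brackets each 3 as a stand-in for the colour or weight change a real report would use, and confirms the count of 7:

```python
# The digit block from the slide; without visual encoding we must scan it
# serially ("attentive processing"), but marking every 3 makes it pop out.
ROWS = [
    "1723957695026398027384956012",
    "9847536970898726547867925019",
    "2005928976548102985079827158",
    "0297456478597069873940588698",
    "5726327189506972915069871256",
    "2783789",
]

def emphasise(rows, target="3"):
    """Wrap the target digit in brackets - a plain-text stand-in for a
    pre-attentive cue such as colour, bolding or size."""
    return [row.replace(target, f"[{target}]") for row in rows]

count = sum(row.count("3") for row in ROWS)
print(count)  # 7, as slide 15 states
for row in emphasise(ROWS):
    print(row)
```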
16. Gestalt Principles
Here we see:
• Proximity – 2 groups rather than 7 blobs; 2 different sets within the groups; 2 further groups
• Similarity within the groups
• Enclosure
17. Gestalt Principles
Our minds close, continue and link items, even though these can be seen as discrete items:
• Closure
• Continuity
• Connection
18. Applying concepts
• Focus on the value add you are showing by organising and minimising the data shown.
• Arrange information in a way that makes sense, making sure that the important data stands out.
19. Edward R Tufte
• Tufte provided lots of thought around how we view and perceive data
20. Data Ink Ratio
• Key concepts: reduce non-data ink from graphics, focus on the values
• Reduce graphic paraphernalia (chartjunk)
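As a sketch of raising the data-ink ratio in practice (assuming matplotlib is available; the regional values are made up for the example), the frame, ticks and gridlines a plotting library draws by default can simply be switched off:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

values = {"North": 42, "South": 31, "East": 18, "West": 9}

fig, ax = plt.subplots()
ax.bar(list(values), list(values.values()), color="0.4")

# Raise the data-ink ratio: strip the frame and tick marks that carry
# no data ("chartjunk"), leaving only the bars and their labels.
for side in ("top", "right", "left"):
    ax.spines[side].set_visible(False)
ax.tick_params(left=False)
ax.set_yticks([])

fig.savefig("regions.png")
```

With the axis clutter gone, the bar heights themselves carry the message; value labels could be added directly on the bars if precise figures matter.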
21. Chartjunk
• Which of these has clutter and unnecessary items?
• In which is it easier to see the data?
22. Colin Ware
Information Visualization: Perception for Design, 2000
“We can easily see patterns presented in certain ways, but if they are presented in other ways they become invisible.”
23. Colin Ware
We distinguish the items if different in terms of:
• Color – either hue or intensity
• Form – can also be size, shape, orientation
24. Colin Ware
We distinguish the items if different in terms of:
• Position
• Motion – flashing/moving should only be used for real-time data or issues requiring immediate attention
25. Stephen Few
• Combined previous theories and melded them with current design, dashboard and communication ideas
26. Stephen Few
This simple example shows how we can’t easily compute size variations in area. The large circle is 16 times larger.
Pie and area charts should not be used, as we cannot quickly recognize the differences.
27. How does this help us?
• Simplify – reduce the data presented
• Simplify – concentrate on important information
• Simplify – remove unnecessary color and distractions
28. Examples
The top example is very difficult to read and interpret numbers from.
The bottom report is cleaner; the items are easy to see without the distracting border and shading.
29. Examples
The top example is very distracting and makes it difficult to focus on areas that need attention.
The bottom report is much cleaner, and items needing attention are easier to see.
30. Design Examples
It is much easier to compare the different market capitalizations when they are presented as a bar chart.
It is also much easier to see the best/worst if the bars are ordered.
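Ordering is a one-line transformation before plotting. A minimal sketch with made-up market capitalizations (the company names and figures are hypothetical):

```python
# Hypothetical market capitalizations (US$ billions). Sorting before
# plotting makes best/worst comparisons immediate for the reader.
caps = {"Acme": 120.5, "Borealis": 310.2, "Cirrus": 87.9, "Dynamo": 198.4}

ordered = sorted(caps.items(), key=lambda kv: kv[1], reverse=True)
for name, cap in ordered:
    print(f"{name:<10} {cap:>8.1f}")
```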
32. Design Examples
1,600 1,500.0
1,400
1,200
1,000
UK
800
Germany In charting the previous
600
North America graphic, it’s obvious
400
165.8 that there is no
US$ Millions
200 128.573.5
20.4 25.2 4.3 45.0 2.9 relationship between
0
2005 2009 2010 the years or countries.
Maybe a table would
1,600 1,500.0 have been better?
1,400
1,200
1,000
2005
800
2009
600
400 2010
128.5 165.8
73.5
US$ Millions
200 4.3 45.0 20.4 25.2 2.9
0
UK Germany North America
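When a chart reveals no real relationship, a plain aligned table often communicates better. A minimal text-table sketch (the cell values reuse the slide’s figures, but their arrangement into rows and columns here is illustrative, not the slide’s actual mapping):

```python
# Illustrative data only: the assignment of values to cells is assumed.
headers = ["", "2005", "2009", "2010"]
rows = [
    ("UK",            4.3,  20.4,  73.5),
    ("Germany",      25.2,  45.0, 128.5),
    ("North America", 2.9, 165.8, 1500.0),
]

def text_table(headers, rows):
    """Render rows as a right-aligned plain-text table."""
    data = [headers] + [[r[0]] + [f"{v:,.1f}" for v in r[1:]] for r in rows]
    widths = [max(len(row[i]) for row in data) for i in range(len(headers))]
    lines = []
    for row in data:
        lines.append("  ".join(cell.rjust(w) for cell, w in zip(row, widths)))
    return "\n".join(lines)

print(text_table(headers, rows))
```

The table gives every exact value in one compact block, which is precisely what the unrelated-series chart failed to do.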
33. Edward R Tufte
The Visual Display of Quantitative Information, 1983
“Graphical excellence is that which gives the viewer the greatest number of ideas in the shortest time, with the least ink in the smallest space.”