Workshop for UXPA DC on April 12, 2014, entitled "All this UX data! Now what?" Attendees learned how to deal with large amounts of user experience data from tests, and how to combine that data to tell a succinct story.
Buttons on forms and surveys: a look at some research 2012 - Caroline Jarrett
Does 'Submit' or 'Send' or 'OK' go to the left or right of 'Cancel'? Does 'Next' go to the left or right of 'Previous'? This talk at the Information Design Conference 2012 discusses three research studies on forms and surveys.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
User Experience (UX) Research in Healthcare - Dan Berlin
Healthcare companies should embrace iterative user research so that they may design products that align with their customers' wants and needs. UX research studies are not clinical trials - they are a means of learning how best to design a product for customers.
Creating a culture that provokes failure and boosts improvement - Ben Dressler
Everyone fails - but not everyone uses failed attempts as a source of learning and improvement. This talk outlines a framework to turn failure into gaining knowledge by understanding IF, HOW and WHY something fails.
Introduction to usability and usability testing as a discipline, followed by how to do guerilla usability testing. Presented at Duke Tech Expo April 13, 2018 with co-author Lauren Hirsh, with content from a prior collaborative presentation of hers.
Slides Ian Multon recently used in his discussion with mentees of The Product Mentor.
The Product Mentor is a program designed to pair Product Mentors and Mentees from around the World, across all industries, from start-up to enterprise, guided by the fundamental goals…Better Decisions. Better Products. Better Product People.
Throughout the program, each mentor leads a conversation in an area of their expertise that is live streamed and available to both mentee and the broader product community.
http://TheProductMentor.com
UserTesting Webinar - Mapping experiences: from insight to Action - UserTesting
Visualization diagrams are a key tool that helps organizations compile business insights from a variety of sources and translate them into a universally understandable format. Producing these diagrams requires empathy for customers, the ability to organize a wealth of information, and visual storytelling skills. In this webinar, Jim guides you through his tried-and-tested methods to create value through journeys, blueprints, and diagrams.
As part of Biblefresh celebrations of the anniversary of the King James Bible this year, Wycliffe Bible Translators have run a series of evening classes, helping people to engage more with the Bible.
In November, Margaret Sim - a translation consultant working in Africa - spoke about irony and metaphor in the Bible, whether it's there and how we approach it. Her talk was entitled 'Does the Bible mean what it says?'
Reconsidering talent development in a connective era - Carmen Tschofen
Nurturing unusual learners often requires unusual educational approaches. Connective and personal learning offers different ways of thinking about learning processes and intents, especially for those who seek, and thrive in, complexity. Conversely, gifted education theory, developed for the "edges," may offer insights into how new and "edge" theories such as connectivism and personal learning can benefit all learners.
Education and policies for gifted students are based on past research and learning traditions. But are these ideas sufficient for anticipating and understanding what might come next for developing learners and ourselves? This session draws on futures (or “foresight”) studies to explore evolving contexts for understanding and supporting gifts, giftedness, and creative talent development in our rapidly shifting and complex environments.
Change Management Initiatives That Ensure Smooth Program Transition and Deliv... - Chazey Partners
Developing and managing clear-cut, yet flexible change management program initiatives is essential to your Shared Services center’s short and long-term success.
By establishing strategic partnerships that encourage optimal communication and understanding between your faculty, departments, and stakeholders, change management can be effectively managed. By attending this session, you will learn how to:
Develop and maintain a flexible approach toward your change management programs, so as to ensure continual improvement
Create the proper messaging based upon your audience type, and ensure message consistency
Develop and implement change management programs that will engage and excite your very diverse workforce
Foster a positive work environment that enhances work productivity and efficiency
Combining research on talent development, the development of expertise, and connectivist concepts such as complexity and learning networks, this presentation examines legacy assumptions about learning and suggests that new understandings might change our perceptions of what it means to be a "high ability learner."
I began an exploration of futures thinking and futures studies in 2005, and began a related, if undefined, study of learning as a form of cultural expression in 2006. This presentation was adapted and updated based on an early mash-up of these interests.
The visual design of surveys and other types of online data collection tools impacts how users perceive, understand and navigate the instrument as well as the responses they provide. Two key considerations that impact how users experience online data collection tools are the device they are using (e.g., smartphone, tablet, computer) and the method of interaction (e.g., website, app or both). When designing online data collection tools, creating a common user experience across different devices and methods of interaction is important to create a consistent user experience and to minimize measurement differences.
In this talk, we will compare the user experience across four different combinations of device and method of interaction of a survey: (1) desktop PC-website, (2) smartphone-app, (3) tablet-app, and (4) tablet-website. Through performance and eye-tracking data, we identify UX elements that must be unified across all devices as well as elements that might need customization for different devices or methods of interaction.
Customer Journey Maps: Why and how UX practitioners use them or avoid them - UXPA International
A panel of seasoned UX practitioners brings their individual experiences to the lively topic of customer journey mapping. Brief statements from each panelist shed light on their positions, with topics ranging from a new way to create a template for an interactive journey-mapping experience, to the issues that arise when different parts of an organization use the same words to mean different things about visualizing customer experience, to techniques for creating such visualizations with a co-located team, to the value of the technique for visualizing workflows for a mobile app, and, on the flip side, why you shouldn't do customer journey mapping at all. With lots of time for questions, this session will be highly interactive.
The tools used by CRO masters around the world to optimise analytics, UX, VOC, insight and testing - all to improve your insight or conversion figures.
On Monday, November 7, 2016, Smart Chicago Collaborative held the first CUTGroup Collective Community call. The goal of the CUTGroup Collective is to convene organizations and institutions in cities to help others establish new CUTGroups, create a new community, and share and learn from one another. For our first community call, we wanted to highlight CUTGroup Detroit's story. Over the last few months, a collaboration across multiple entities invested in Detroit (the City of Detroit, Data Driven Detroit, and Microsoft) recruited for and conducted their first CUTGroup test. On our first call, the team involved will talk about their successes and challenges in building CUTGroup Detroit.
Slides were created by the CUTGroup Detroit team, which includes the City of Detroit, Data Driven Detroit, and Microsoft.
How well are you delivering your experience? - Andrew Fisher
The web has always had fragmentation, though not on the scale we're seeing now with new devices - and that's before we consider hybrid-touch laptops, microscreen smart watches, gesture interfaces or displays the size of a wall. Testing all the user permutations of your application is becoming almost impossible, so how do you go about working out whether you're delivering a good experience or not?
In this session, we'll look at the use of responsive-design-oriented analytics coupled with a few statistical methods that will help determine how well you're delivering your experiences and highlight the areas you need to focus on next in order to maintain a decent level of coverage.
Zen and the art of requirements gathering, why getting to "In time, On budget... - Femke Goedhart
Often forgotten or trivialized, good requirements gathering can make or break your project. This session will give you techniques and tips on how to effectively get to the core of the requirements, identify ways of prioritizing them, and explain some core concepts of functional and technical design. Based on years of experience gathering requirements (and working with them!), Femke & Tim will take you through some of the real-life examples they've come across and a lot of do's & don'ts they have run into, tying them into practice and theory that can help you get your projects off to a better start.
This session was presented on March 30th 2015 in Ghent, Belgium during the http://Engage.ug user group event by Tim Clark & Femke Goedhart.
This is the talk I gave at the 2019 Appium Conference in Bangalore, India. In it, I go over the current challenges we face in today's development world, why we need more tools to help us keep pace, and how you can build your own crawler.
I've open-sourced this tool, and it is available here for everyone to use: https://github.com/isonic1/Appium-Native-Crawler
See the video here => https://www.youtube.com/watch?v=u-gAn8bVbPg
Webinar: Everyone cares about sample quality but not everyone values it! - Matt Dusig
On December 7, 2016, Mark Menig, Chief Executive Officer of TrueSample and Lisa Wilding-Brown, Chief Research Officer of Innovate MR explored various strategies to help research professionals navigate the challenging landscape of online sample quality. The webinar addressed:
• A brief overview of quality through the years. Where have we been and where are we going?
• What are current examples of online sample fraud (i.e., bots, hijackers, foreign click shops etc.)?
• What are the challenges and costs associated with today’s online fraud? How does online fraud impact data quality, specifically B2B research?
• What technical and behavioral strategies help to protect online research?
User Experience (UX) focuses on understanding users' needs and values, and on providing practical products or services. This human-computer interaction works the same way when the users are developers. This talk focuses on Developer Experience (DX): how to establish a good relationship between developers and platform or API providers.
For the past 20 years we have developed, designed, marketed with, for and at users. UX and usability are getting a lot of attention within that context. Beyond the buzzwords, we will look at two practices that shape our daily lives. What does it mean for an application to be usable and how's that part of creating an amazing user experience? This talk aims to discuss some actionable developer tips to ensure that your applications resonate with users.
Fundamentals of Lean UX, Agile on the Beach 2014 - Adrian Howard
Lean UX sits at the intersection of the Agile, Lean Startup & User Experience communities of practice.
This workshop will introduce you to the basics of the Lean UX approach, and take you through the process of applying Lean UX techniques at different stages of the product/business development process.
Learning outcomes:
* Lean UX and its relation to Lean Startup, Agile UX & general Lean approaches
* The common myths and misunderstandings about Lean UX
* How to apply Lean UX approaches within your own company
* How the hypothesis/experiment model differs from traditional requirements
* How Lean UX can be used to understand customers better, discover new product ideas, and reduce risk in new product development
Is This a Button? A Question Your Users Should Never Ask. - Andrew Malek
Buttons are a primary way people interact with our websites and apps, but recent design trends have caused confusion over what actually is a button and what is static text or imagery. Perhaps you're in QA and want to increase the quality of an app, a designer looking for usability tips, or a developer who normally thinks nothing more about a button than instantiating one and placing it in a layout. Whatever your role, learn about ideas you can test to possibly increase your app or website's usage, guiding people to lead-generation or checkout activities. Topics include color theory, floating buttons, Fitts's Law, microinteractions, and perceived performance.
What Can Performance Support Designers Learn from User Experience Designers? - Jonathan Mann
Presentation from my session at the Performance Support Symposium, Sept 2014 in Boston, MA.
UX designers create performance support regularly without actually calling it that. Designers of performance support can learn several things from UX designers' methods.
Any of these happen to you?
* Tasked to develop a user interface with an incomplete design spec, so had to make guesses such as where to position on-screen elements?
* Worked on a small team without a full-time designer, and requested to “just put a screen together for a demo”?
* Been asked to consult with a user interface designer, but don’t know what types of questions to pose?
Nowadays, everyone wants attractive, easy-to-use interfaces, so if you’re more comfortable sifting through Java or C# code than OmniGraffle or Visio mockups, learn about topics that can assist in creating more usable desktop applications, mobile apps, and websites. This talk provides easy-to-implement hints that can improve even a bad or “so-so” user interface. Areas of focus include the need for consistency; “negative space”; location, location, location (it’s crucial in screen real-estate, too!); contrasting colors; and the importance of action verbs.
Little Known Features of Research Suite (that will make your life easier!) - stephchristensen15
Have you ever had one of those moments where you think to yourself, "How did I not know this before?" Join us for a fast-paced session as we uncover some of our favorite features to help you make a bigger impact in your research.
Processing Speed and Vocabulary are Related to Older Adults' Internet Experie... - Jennifer Romano Bergstrom
Some cognitive declines commonly occur with aging, yet they are seldom taken into account by Website designers and User Experience (UX) researchers. In this empirical study, we compared younger adults, middle-age adults, high-functioning older adults, and low-functioning older adults to examine whether there is a relationship between aspects of cognition and performance when using a Website. Performance was measured by accuracy (percent of tasks completed successfully), efficiency (mean time to complete tasks) and self-rated satisfaction, three commonly used usability metrics. Results suggest that processing speed and vocabulary may be related to Internet performance. Specifically, older adults with faster processing speed and/or high vocabulary may perform better than their lower-functioning counterparts. More importantly, these older adults perform similarly to younger adults.
Whether surveys or forms are your final product or a part of your website/app, creating usable surveys and forms is crucial to a strong user experience for both the user entering information and the user receiving the information. In this session, you will learn about UX principles that drive a strong user experience when completing surveys/forms. The session will focus on understanding the key components of surveys/forms, what they are used for, and how to use them effectively. Topics include using labels to make forms and surveys easier, writing clear instructions, reducing respondent burden, and determining appropriate input fields such as check boxes versus radio buttons. Examples will include findings from usability studies and empirical research, some of which include eye tracking. Usability testing, eye tracking, and other user experience research methods will be discussed.
The way we ask questions and behave during UX sessions affects the data we collect and the interpretations of our findings. In order to collect good UX data, it is important for the moderator to be neutral, structured, and unbiased while setting a comfortable stage for participants to share their thoughts and reactions. In this interactive 45-minute session, you will learn about the importance of structured, unbiased methods to collecting user feedback. We will discuss different methods (e.g., in-lab testing, remote moderated/unmoderated testing, surveys, card sorting, focus groups) and pros/cons of each. You will learn about different data that can be collected from usability tests, including subjective (e.g., what participants verbalize about their experience), behavioral (e.g., what participants do) and implicit (e.g., what participants think but cannot explain) data. We will discuss how to ask participants questions in ways that do not introduce biases, and how additional methods, such as eye tracking, may be valuable in understanding the users’ experience. You will learn how to ensure the data we get from UX tests are reliable and valid.
User-Centered Research on the Paying for College Website and Tools - EDUI 2014 - Jennifer Romano Bergstrom
The Paying for College website is designed to help consumers make informed decisions about college finance. The Consumer Financial Protection Bureau (CFPB) began development with a user-centered design process for this tool-set, which is now in its 4th iteration. The college cost and financial aid comparison tool, a central feature of these resources, supports efforts by the Department of Education to standardize financial aid disclosures.
During this session, we’ll cover the most recent rounds of usability testing conducted with multiple groups across the U.S. We’ll highlight difficulties when designing and testing for multiple audiences with different needs as well as testing and iterating with live and prototype versions of the site. Data will also be emphasized as we share collection methodologies (click paths, eye tracking, questionnaires, etc.) and the importance of each. We’ll also provide insights into planning, execution, and reporting and how these findings informed major changes on the website.
This talk briefly covers usability and the user experience and then discusses posting to social media in a way that is consistent with how people use it.
Age-Related Differences in Search Strategy and Performance When Using a Data-... - Jennifer Romano Bergstrom
Erica Olmsted-Hawala presented these findings at HCII 2013 in Las Vegas. Data are from a lab-based experimental usability study, in which we showed that older adults have greater difficulties with cognitively challenging tasks. However, even young adults have difficulties with complex data tables that are often found on government Web sites.
Jonathan Mendelson presented this talk at HCII 2013 in Las Vegas. Data are from a probability-based online panel with US adults over the age of 25. We found that QR Code awareness, knowledge, and usage were highest for young adults and lowest for older adults. See slides for more details and see Jonathan's blog post about this talk at: http://www.forsmarshgroup.com/index.php/blog/post/hcii-2013-preview-age-and-qr-codes
Caitlin Krulikowski presented this at ESOMAR 2013 in Boston. Results are based on a probability-based paper survey of American youth. Find out what youth think about and do on Pinterest... See Caitlin's blog post about this presentation here: http://www.forsmarshgroup.com/index.php/blog/post/esomar-pinterest-preview
Jon Strohl organized an Ignite session in which many of us "pitted" different UX methods against each other. In mine, I argue for why remote UX testing is the best UX method. (Of course, I like many methods, and each is "the best" in different situations, but for the sake of this presentation, I had some fun...) See my blog post about this: http://www.forsmarshgroup.com/index.php/blog/post/uxpa-recap-part-ii-why-remote-testing-is-the-most-preferred-ux-method.
Beyond Eye Tracking: Using User Temperature, Rating Dials, and Facial Analysi... - Jennifer Romano Bergstrom
Dan Berlin, Jon Strohl, David Hawkins and I presented this at UXPA 2013. Eye tracking is well known and accepted in the UX community. Here we present preliminary evidence for the usefulness of adding electrodermal activity (EDA), continuous dial ratings, etc. to user experience research.
Slides for a short course I taught for UXPA DC on Feb 27, 2013. This is UX 101: the basics if you are new to UX and usability. The focus is on desktop websites, but many of these principles apply to other products (e.g., surveys, apps) and devices (e.g., tablets, smartphones). Stay tuned for an updated version that is mobile-heavy.
Typically survey pretesting involves separate timelines and research staffs for cognitive and usability testing. In this paper, we make the case that a more comprehensive and less labor-intensive approach to pretesting is to conduct both cognitive and usability testing concurrently. By testing the same questionnaire concurrently with respondents and interviewers (the users in this case), potentially problematic question wording and instrument design can be more efficiently identified in a way that can be used to improve the questionnaire for both the respondent and the interviewer.
In 2005 and 2006, the U.S. Census Bureau conducted separate rounds of cognitive and usability testing on an interviewer-administered non-response follow-up questionnaire in preparation for the 2010 Census. The usability testing was conducted in the Census Bureau’s Usability Lab with an early version of the instrument. Later, the Census Bureau’s Cognitive Lab conducted cognitive testing of the instrument. In doing the testing separately, we learned that in addition to usability issues, usability testing also identifies question wording issues, but that usability staff does not have the specialized experience (or sometimes the authority) to make recommendations in that arena. Similarly, while examining question wording, cognitive testing also identifies poor usability features, but the cognitive-testing staff lacks the experience with such testing to be able to recommend improvements in usability features. Based on this observation, in 2008, the Cognitive and Usability Labs at the Census Bureau conducted 40 cognitive and 20 usability interviews concurrently and in conjunction to test the questionnaire and presented results and recommendations from both types of testing together. When testing is conducted concurrently, staff from both labs, representing both specialties, can be at the table at once, creating a more efficient methodology. By examining these two case studies, this paper will discuss what can be gained by conducting these studies in concert above and beyond conducting them independently. Examples of the kinds of findings that are possible through this joint research and the synergy from having both research teams involved will be described.
Surveys are increasingly being conducted online, and it is pertinent to establish clear guidelines for presenting self-administered survey items on a computer screen. Goals should include reducing respondent burden and measurement error. Web survey designers often need to decide how to best present long lists of information and where to place navigation buttons (i.e., next and previous). We evaluated the usability of web-based National Survey of College Graduates (NSCG) questionnaire prototypes. In the first round of testing (n=8), respondents had difficulties proceeding through the survey because the ‘Next’ button was on the left side of the screen and the ‘Previous’ button was on the right. In the second round of testing (n=30), four versions of the survey were tested to assess usability of the ‘Next’ and ‘Previous’ buttons based on their placement on the screen (right or left side) and additionally, to assess the display format of a long list of occupation options (one-column versus two-columns). Dependent measures included participants’ comments from a think aloud protocol, self-reported ratings of satisfaction with the survey, responses to qualitative debriefing questions, time required to complete the job code item, and eye-tracking data focusing on which button (i.e., ‘Next’ or ‘Previous’) and which half of the job code list participants looked at first and more often. Qualitative and quantitative results revealed that participants preferred and performed quicker when the long list of occupation options was displayed in two columns rather than one and when ‘Next’ was displayed to the right of the ‘Previous’ button rather than vice versa. Our findings support usability best practice guidelines that recommend eliminating excessive scrolling on Web sites, and following reading conventions (i.e., looking to the right to move forward, as if turning a page in a book).
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply doing machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains will only be realized when the symbolic structures have an actual semantics. I give an operational definition of semantics as "predictable inference".
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
UiPath Test Automation using UiPath Test Suite series, part 3 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation introduction
UI automation sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
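The wiring demonstrated in the webinar is typically done through JMeter's built-in Backend Listener, configured with the InfluxDB client implementation. A minimal sketch of its parameters (the values below are illustrative, assuming a local InfluxDB instance with a database named "jmeter"):

```properties
# Backend Listener implementation (ships with JMeter 3.2+):
#   org.apache.jmeter.visualizers.backend.influxdb.InfluxdbBackendListenerClient
influxdbUrl=http://localhost:8086/write?db=jmeter   # InfluxDB write endpoint and target database
application=myapp                                   # tag used to filter results in Grafana (example name)
measurement=jmeter                                  # InfluxDB measurement the samples are written to
summaryOnly=false                                   # send per-transaction metrics, not only the totals
samplersRegex=.*                                    # which samplers to report
percentiles=90;95;99                                # response-time percentiles to record
testTitle=Load test                                 # annotation shown on the Grafana timeline
```

Grafana then reads these series through an InfluxDB data source pointing at the same database; community JMeter dashboards that expect this measurement layout can be imported as a starting point.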
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms, and is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that lead to closing the deal.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
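The deployment path the talk describes can be sketched with standard Helm commands. Everything below is a hypothetical illustration: the chart repository URL, chart name, and value keys are placeholder assumptions, not taken from the talk's actual chart:

```shell
# Add the Varnish Helm chart repository (URL is a placeholder assumption)
helm repo add varnish https://charts.example.com/varnish
helm repo update

# Install Varnish as a caching layer in front of an existing Service
# (release name, namespace, and backend value keys are placeholders;
# consult the chart's values.yaml for the real keys)
helm install my-cache varnish/varnish \
  --namespace caching --create-namespace \
  --set backend.host=my-app.default.svc.cluster.local \
  --set backend.port=80
```

The general pattern is what matters: the chart deploys Varnish pods plus a Service, and application traffic is routed through that Service so responses can be cached before hitting the backend.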
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
So much UX data! Now what?
1. So much UX data! Now what?
Jennifer Romano Bergstrom
April 12, 2014
UXPA-DC Workshop| Arlington, VA
@romanocog
2. Measuring the UX
• How does it work for the end user?
• What does the user expect?
• How does it make the user feel?
• What is the user's story and habits?
• What are the user's needs?
Usability = "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use." ISO 9241-11
+ emotions and perceptions = UX
@romanocog @uxpadc
3. Usability vs. User Experience (UX)?
Whitney's 5 Es of Usability: "The 5 Es to Understanding Users" (W. Quesenbery), http://www.wqusability.com/articles/getting-started.html
Peter's User Experience Honeycomb: "User Experience Design" (P. Morville), http://semanticstudios.com/publications/semantics/000029.php
4. What People Do on the Web
Krug, S., Don't Make Me Think
5. UX data
OBSERVATIONAL
+ Ethnography
+ Time to complete task
+ Reaction time
+ Selection/click behavior
+ Ability to complete tasks
+ Accuracy
IMPLICIT
+ Facial expression analysis
+ Eye tracking
+ Electrodermal activity (EDA)
+ Behavioral analysis
+ Linguistic analysis of verbalizations
+ Implicit associations
+ Pupil dilation
EXPLICIT
+ Post-task satisfaction questionnaires
+ In-session difficulty ratings
+ Verbal responses
+ Moderator follow-up
+ Real-time +/- dial
6. Explicit data
[Bar charts of participant ratings (0–9 participants): "How likely would you be to recommend this site to a friend?" and "How likely would you be to use this site in the future?", with response categories Not likely at all or Slightly likely / Moderately likely / Very likely.]
"Love the picture in the middle of it."
"It looks very clean and very simple."
"It looks pretty organized, it's a nice design."
When asked how they would save information, four of six participants said they would bookmark the page or take a screenshot of the information. Only two mentioned that they would use the site functionality to save for later use.
[Pie chart: Percentage of Difficulty Ratings* — 83% rated 1 or 2, 9% rated 3, 9% rated 4 or higher.]
*Satisfaction Questionnaire Q7: Please rate how difficult it was to log in on this device. 1 = not difficult at all to 5 = extremely difficult.
7.
8. Implicit data
[Line chart: P29 iPhone, y-axis 6–9, time 00:00.00–00:52.50. Annotations: Begin date and time selection; End date and time selection; Trouble with scrolling; More trouble with scrolling.]
9. Implicit data
[Same P29 iPhone chart as slide 8, repeated.]
15. Gaze plots and comments
"Man, this is a long paragraph."
"There's a lot of information, it'd be a lot better in list form. Ideally, you want to get your information quick without reading through all this."
16. Usability test of a low-fi prototype
@forsmarshgroup @romanocog
• video
17. Usability test of a high-fi prototype
• video
18. More data
Fixation count heat map of the Round 2 home page from 7 participants during the "scam" task.
Mean time elapsed before AOI is fixated:
- Left navigation: 79.55 (18.54), N = 8
- Center icons: 29.36 (9.2), N = 6
[Bar chart (0–9 participants): "How clear is the information on this page?"** with response categories Not at all to Slightly Clear / Moderately Clear / Very to Extremely Clear.]
19. Usability test across devices
• "Where and how you click is a bit counter-intuitive. [It's] not super obvious which button to click to get to next sections."
• "I feel like the 'Next' should be at the bottom and not the top."
Intuitive 'Next' button location
Non-intuitive 'Next' button location
20. Combining data
• "I'm not expecting them to email or call me. I don't expect any person to notify me."
• "I'm not sure when I'd get an answer. It's not like Yahoo! Answers where it's immediate."
Participants had different expectations about what would happen next.
• Five expected to hear back via email
• One expected an immediate response
• One expected to see the answer posted somewhere on the site
• Two said they would call for assistance.
video
Round 2 prototype. Heat map of the Round 2 home page from 7 participants during the "scam" task to assess ease of navigation to the bundled page.
The location and design of buttons and features should be consistent across platforms. There were separate Android, desktop, and Apple teams working independently, so small inconsistencies crept in; for example, the location of the 'Next' button differed between versions because communication between the teams was poor.