This document provides an overview of Tennessee's Alternate Assessment Portfolio (TCAP-Alt) for students with significant disabilities. It discusses the components of the portfolio, including content, choice, peer interaction, natural supports, and inclusive settings. Examples are provided for properly documenting these components using evidence sheets, graphs, and other tools. Guidelines are also reviewed for determining student participation eligibility and for collecting and measuring student data according to alternate content standards.
Writing exam questions is one of the most important parts of teaching nursing. Nurse educators must have the right roadmap for what to include while developing those exams. This presentation provides direction on how to develop a test blueprint and how to revise questions.
Grading Your Assessments: How to Evaluate the Quality of Your Exams (ExamSoft)
How satisfied are you with the last assessment you gave? Would you describe your exam as a highly effective evaluation tool? How much information does it reveal about individual students' abilities, and the overall performance of your current class as compared to previous classes? Do you trust your assessment to accurately identify which students "get it," and which ones clearly do not grasp the content or meet the expected standards required to pass your course?
The use of a 3-step item analysis method based on an item’s difficulty levels, discrimination values, and response frequencies provides a revealing look at the quality of your assessment by focusing your attention on the effectiveness of each test item and its contribution to the exam blueprint. Save time and effort in identifying exactly which exam questions need editing, and how much editing is required, before you take any action. You’ll likely find that replacing the item with a brand new question may not be necessary. Learn how your efforts to make small improvements within just a few exam items, guided by a systematic process of reviewing statistical results before you start editing, can drastically enhance the items’ quality, and eliminate the need to spend hours rewriting the entire exam. By using this item analysis method, your future assessments will be able to provide an accurate measurement of your students’ abilities to apply nursing content and solve clinical problems.
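The three statistics named above can be computed directly from response data. Below is a minimal sketch of that computation; the function names and data layout are illustrative assumptions, not part of any specific exam platform. Difficulty is the proportion answering correctly, the response frequencies expose weak distractors, and discrimination here uses the conventional upper-lower (27%) group method.

```python
from collections import Counter

def item_difficulty_and_frequencies(responses, key):
    """responses: the answer choices one item received (e.g. 'A'-'D');
    key: the correct choice. Returns the proportion correct (difficulty)
    and a Counter of how often each option was chosen (distractor view)."""
    n = len(responses)
    difficulty = sum(r == key for r in responses) / n
    return difficulty, Counter(responses)

def discrimination_index(item_correct, total_scores, frac=0.27):
    """Upper-lower discrimination: the item's proportion correct among the
    top scorers minus the proportion among the bottom scorers."""
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    k = max(1, int(frac * len(total_scores)))
    low = sum(item_correct[i] for i in order[:k]) / k
    high = sum(item_correct[i] for i in order[-k:]) / k
    return high - low
```

An item with difficulty near 0.5 and a clearly positive discrimination index usually needs no rewrite; a negative index or a distractor drawing more responses than the key flags the item for editing.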
Psychometrics 101: Know What Your Assessment Data is Telling You (ExamSoft)
Presented by Eric Ermie, Executive Director of Sales, ExamSoft Worldwide, Inc.
Keep it? Throw it out? Content/teaching issue? Bad question? Too easy? Too hard? What the heck? More than likely you have asked some or all of these questions at one point or another when trying to understand the performance of questions on an assessment. With differing opinions on how to interpret the statistics provided, how do you know what all this data is trying to tell you? Join us for a webinar on the fundamentals of item analysis, how the data is derived, and the different ways they can be interpreted. This presentation will cover how to put data into a useful context that will allow you to draw your own conclusions on what it means, how you should apply them, and why you should ignore rules that others may use for their specific situation.
2015 EDM: Leopard for Adaptive Tutoring Evaluation (Yun Huang)
This is the presentation for our paper in 2015 EDM: Gonzalez-Brenes, J. P., Huang, Y. Your model is predictive— but is it useful? Theoretical and Empirical Considerations of a New Paradigm for Adaptive Tutoring Evaluation. In: Proceedings of the 8th International Conference on Educational Data Mining (EDM 2015), Madrid, Spain, pp. 187-194.
Presentation made during the Intelligent User-Adapted Interfaces: Design and Multi-Modal Evaluation (IUadaptME) workshop, conducted as part of UMAP 2018
Building Institutional Research Capacity in a K-12 Unified District (Christopher Kolar)
In higher education, Institutional Research (IR) offices function to audit the academic output of the institution, evaluate program efficacy, and monitor student success. Effective institutional research supports the understanding, planning, and operation of programs informed by a recognition that different functions of an institution are interrelated and dependent. This session will outline practices by the Department of Research, Evaluation, and Assessment in the Palo Alto Unified School District – a division designed and staffed using an IR model.
To download this presentation, please visit www.biowalesir.com.
For more informative content, watch the Know_U YouTube channel.
Paper-pencil tests, oral tests, and performance tests: tools available for the evaluation of overall personality.
Predicting the academic performance of an elementary school using attributes like class size, enrollment, poverty, parent education, student performance, and teacher credentials for 400 elementary schools in the California Department of Education's API 2000 dataset.
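A model of the kind described can be sketched as an ordinary least-squares fit of school attributes against an API score. The column names and toy numbers below are invented for illustration; they are not the actual API 2000 data.

```python
import numpy as np

# One row per school: [class_size, pct_poverty, pct_parent_college]
# (hypothetical attributes); target y is the school's API score.
X = np.array([[20.0, 60.0, 15.0],
              [25.0, 80.0,  5.0],
              [18.0, 30.0, 40.0],
              [22.0, 50.0, 25.0]])
y = np.array([650.0, 540.0, 780.0, 690.0])

# Add an intercept column and fit ordinary least squares.
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
predictions = A @ coef
```

With real data one would hold out schools for validation and inspect the coefficients, since attributes like poverty and parent education are strongly correlated.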
Local school board members are a key link between school districts and communities. They represent public concerns around testing and can hold district officials accountable. Given the critical role that local school boards play, Achieve and the National School Boards Association have developed “Assessment 101” resources for school board members. This professional development module is designed to:
· outline the critical role school boards play in supporting high quality assessment systems;
· introduce school board members to key assessment concepts and issues;
· provide an introduction to the Student Assessment Inventory for School Districts as a process to streamline testing and support limited, high-quality assessments for all students.
CERA 17: District Program Evaluation to Improve RTI/MTSS (Christopher Kolar)
Palo Alto Unified School District is a high-performing district with substantial within-district achievement gaps. The district conducted its first evaluation of Response to Intervention (RTI) practices and outcomes in 2016-17, in a partnership between the Research, Evaluation, and Assessment (REA) Department and the Elementary Assistant Superintendent. This evaluation describes RTI practices in the district, examines student outcomes, and shares lessons learned and recommendations for improving RTI/multi-tiered system of supports (MTSS) and for using program evaluation to create a culture of continuous improvement and collaboration in a district.
Learning analytics and accessibility – #calrg 2015 (Martyn Cooper)
Presentation at the Open University's Computers and Learning Research Group (CALRG) Conference 2015 on Learning Analytics and Accessibility - detecting accessibility deficits with Learning Analytics approaches
E-Assessment Conference Scotland 2014 presentation
As technology evolves and becomes more integrated into education, the data trail created by learners is enormous. The analysis of this data referred to as “Learning analytics” drives learning in a cyclical pattern; data is collected, analysed, and interventions are made based on the data. After these interventions, more data is collected and analysed, and additional (perhaps different) interventions are made.
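The collect-analyse-intervene cycle described above can be sketched as a simple loop. Everything here, from the threshold to the score data, is an invented illustration of the pattern, not a real analytics system.

```python
def analytics_cycle(collect, intervene, rounds=3, threshold=40):
    """Each round: collect {student: score}, flag scores below the
    threshold, and pass the flagged students to the intervention step."""
    for _ in range(rounds):
        scores = collect()
        at_risk = [s for s, mark in scores.items() if mark < threshold]
        intervene(at_risk)

# Toy usage: one struggling student whose score rises after intervention.
data = {"amy": 35, "ben": 70}
log = []

def collect():
    return dict(data)

def intervene(students):
    log.append(list(students))
    for s in students:
        data[s] += 10  # the intervention nudges the score up

analytics_cycle(collect, intervene)
```

After the first round the intervention changes the data, so later rounds flag nobody: the cycle is exactly the feedback loop the paragraph describes.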
This presentation outlines how the data related to assessments is collected from three different projects within DCU and then analysed with the aim of improving the student learning experience. Each project has two common threads: making life easier for the lecturer and improving the experience of the student.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Transcript: Selling digital books in 2024: Insights from industry leaders (BookNet Canada)
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl (DanBrown980551)
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe (Paige Cruz)
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on.
Communications Mining Series - Zero to Hero - Session 1 (DianaGray10)
This session provides an introduction to UiPath Communications Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communications Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
SAP Sapphire 2024 - ASUG301 Building Better Apps with SAP Fiori (Peter Spielvogel)
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
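The general idea of stripping uninteresting seed bytes can be sketched with a greedy, delta-debugging-style loop: drop a byte and keep the removal if the observed execution path is unchanged. This is an illustration of the concept, not the DIAR algorithm itself, and `toy_path` below is an invented stand-in for real coverage instrumentation.

```python
def trim_seed(seed, path_sig):
    """Greedily drop bytes whose removal leaves the execution-path
    signature unchanged, keeping only bytes the target reacts to."""
    trimmed = seed
    baseline = path_sig(trimmed)
    i = 0
    while i < len(trimmed):
        candidate = trimmed[:i] + trimmed[i + 1:]
        if path_sig(candidate) == baseline:
            trimmed = candidate  # byte i was uninteresting; keep it removed
        else:
            i += 1               # byte i matters; leave it and move on
    return trimmed

# Toy "target": its path depends only on which marker bytes appear.
def toy_path(data):
    return frozenset(b for b in data if b in b"AB")
```

On the toy target, `trim_seed(b"AxxBxx", toy_path)` shrinks the seed to `b"AB"`: every filler byte is removed because it never changes the path signature.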
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
UiPath Test Automation using UiPath Test Suite series, part 4 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
The climate impact and sustainability of software testing are discussed in the talk. ICT and testing must carry their part of global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint: a positive impact on the climate. Quality characteristics can be extended with sustainability, and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
TCAP-Alt Portfolio and Standards Training
1. TCAP ALT PA and Alternate Content Standards
Krista Bolen, Amy Corbin, Cecilia Franklin
2. Objectives
• Review TCAP ALT Manual and Forms
• Review Tennessee Alternate Content Standards
• Discuss data collection and measurement
• Review tools and resources for completing Portfolios
3. What is a portfolio?
It is the TCAP test for students with significant disabilities. It has 28 pages; it contains evidence sheets and graphs. It documents progress over time. It is scored with a rubric. Data is collected on Alternate Standards.
6. Black Binder
• comes from MNPS Group Testing
• is for multiple students
• is for your data and forms
• is considered secure test material
• cannot be removed from school or taken home
• can be kept from year to year
• needs to be in a secure place in the classroom
White Binders
• come from the State of TN
• are for individual students
• come in a box that must be kept and have a barcode
• are secure test materials and cannot be removed from the school
• are turned in to Group Testing and sent back to Measurement Inc.
8. Rubric Determination
• Three types:
  • Regular
  • Modified: must enter a Report of Irregularity online (must be completed with knowledge of the TCAP-Alt testing coordinator)
  • Homebound: must enter a Report of Irregularity online (must be completed with knowledge of the TCAP-Alt testing coordinator)
9. Student Performance/AYP
• This assessment is JUST as IMPORTANT as the TCAP that other students take!
• If a student does not take ANY TCAP assessment, it counts as both Below Proficient and Non-Participating.
• Data from TCAP-Alt portfolios are counted toward TVAAS, are reflected in the new Teacher Evaluation System, and are in the Data Warehouse.
10. Participation Guidelines
• The IEP team must decide on portfolio assessment, and all paperwork must be signed before data can be collected.
• In the case of a single eligibility (Functional Delay), it is an IEP team decision. Items to be discussed:
  o in past testing, all accommodations and MAAS participation did not yield proficient scores
  o the student's curriculum is based on alternate standards
  o a score will be generated but not reported to the Feds
• The school psychologist does not have to agree but does have to complete the first section of the Participation Guidelines.
16. Anatomy of an Alternate Standard
Content Area: Math
Content Standard: Geometry
ALE: G.2. Specify locations and describe spatial relationships
API: G.2.3 Identify parallel and intersecting lines
18. Content (50 points)
Three different Content Standards, with a minimum of 1 activity related to the Alternate Performance Indicators per content standard, are evident. At least 15 occurrences of data collection and graphing showing progress are documented throughout the data period for each Alternate Performance Indicator assessed.
19. Choice (20 points)
At least 3 types of Choice evidenced and related to at least 3 activities. (See the TCAP-Alt Teacher's Manual for examples of Choice types.)
20. Choice
• Five different choices to select from: Where, When, Who, Reward, Materials.
• Choice must be logical to the activity.
• Food choices are only appropriate if related to the activity (e.g., purchasing food from a vending machine related to a math activity involving coins).
• Reward choices involving food are not appropriate.
• More details on page 37.
22. Notes About Settings
• Setting is where the data point occurred.
• Inclusive settings should be where students in general education are, too.
• Academic inclusive settings (biology, social studies, literacy block) can be counted/used numerous times.
• Specialty areas (gym, music, art, computer lab) can only be counted once per Content Area.
• Settings should be logical.
• For more details, see pages 37-39.
24. Natural Support
A person who is available to all students in the school.
*You must have an inclusive setting to have a natural support signature.
25. Natural Support
IS: only in a general education setting; an adult; available to all students, with and without IEPs (e.g., librarian, PE coach, art teacher, 5th grade science teacher).
IS NOT: Special Education staff; a peer; just for students with IEPs (e.g., a paraprofessional).
The Natural Support person was involved in the activity. More details on pages 40 and 41.
26. Peer Interactions (10 points)
Student interactions with peers are related to the Alternate Performance Indicators assessed under 3 different content standards.
27. Peer Interactions
• Interaction relates to the activity, and the interaction is described.
• Appropriate peers are NOT taking TCAP-Alt.
• Peers are age appropriate.
• Peers sign their first name only.
• There are two places to document Peer Interaction, but only one is needed.
• More details on page 42.
29. Graphs
• Four graph choices:
  • Graph 1: most appropriate for lowest-functioning students
  • Graph 2: most appropriate for prerequisite skills
  • Graph 3: most versatile
  • Graph 4: most suitable for task analysis
30. Prompt Data (Graph 1)
• Criteria MUST be defined.
• Levels of prompting (or the prompt hierarchy) must be used consistently.
• Definitions of prompt levels are in the TCAP ALT manual (pg 33); you should use these definitions if you plan to use this graph!
32. Percent Correct (Graph 3)
• Most flexible and versatile form of data collection.
• Allows for different numbers of trials (a minimum of 5 is recommended).
• Task must be defined clearly so that "correct" is not subjective.
• Can be easily adapted to any subject, task, or setting.
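The percent-correct calculation behind Graph 3 can be sketched as below; the function name and the True/False trial representation are illustrative assumptions, not part of the TCAP-Alt manual.

```python
def percent_correct(trials):
    """trials: list of True/False outcomes for one day's data collection.
    The manual recommends a minimum of 5 trials per data point."""
    if len(trials) < 5:
        raise ValueError("fewer than the recommended 5 trials")
    return 100.0 * sum(trials) / len(trials)
```

Because the result is a percentage, days with different numbers of trials remain comparable on the same graph.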
34. Graphs
• Student Name, Grade Level, Content Area, Standard, ALE, and API should be the same as on the evidence sheet.
• For each day data is collected for the API, write the date in the box on the top line of the graph.
• Highlight the date column on the graph which corresponds to the date on the evidence sheet.
• Student performance is indicated by a dot in the CENTER of the data day.
• Use blue ink.
37. Discrete Trial vs. TCAP Alt
Receptive letter ID: Touch A; Touch A w/1 distractor; Touch A w/2 distractors; Touch B; etc.
Recognize lowercase letters: Touch 10 letters...
38. Definition of progress (pg 35)
"Progress is defined as at least 3 occurrences during which the student performed at least 2 increments above the lowest point. The occurrences must be after the FIRST occurrence of the lowest point. Data cannot flatline for more than 5 points at ANY level."
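The quoted rule is mechanical enough to check in code. Below is a minimal sketch under one plausible reading of the manual's wording (treating "increments" as whole units on the graph's scale); the function name and list-of-values representation are invented for illustration.

```python
def shows_progress(points):
    """Check a sequence of graphed data points against the progress rule:
    at least 3 occurrences at least 2 increments above the lowest point,
    all after the FIRST occurrence of that lowest point, and no flatline
    of more than 5 consecutive points at any level."""
    lowest = min(points)
    first_low = points.index(lowest)
    # occurrences >= 2 increments above the lowest, after its first occurrence
    gains = [p for p in points[first_low + 1:] if p >= lowest + 2]
    if len(gains) < 3:
        return False
    # reject any run of more than 5 identical consecutive values
    run = 1
    for prev, cur in zip(points, points[1:]):
        run = run + 1 if cur == prev else 1
        if run > 5:
            return False
    return True
```

For example, the series 2, 3, 4, 4, 5, 4, 5 shows progress (five occurrences at or above 4 after the lowest point of 2), while a series that never rises 2 increments above its low, or that sits flat for 6 points, does not.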
42. Scores
• Scores are reported at the end of the school year.
• Portfolios are scored Below Basic, Basic, Proficient, or Advanced.
• All scores are blended into the school's AYP scores.
• Nonparticipating students (functionally delayed) will count against the school.
• Scores are reported in the Data Warehouse.
43. Important Items to Remember
• Blue ink is required for all handwritten items.
• Yellow highlighter is preferred.
• The dot must be in the center of the square.
• Do not white out signatures, dates, or data points.
• Work sessions are a great way to get feedback and support.
• Online resources are available to make your life easier...
44. Web resources
• State website: http://www.tn.gov/education/assessment/TCAP-AltPortfolio.shtml
• Tasks Galore: http://www.tasksgalore.com/
• Shoe Box Tasks: http://www.shoeboxtasks.com/
• File folder activities: http://filefolderfun.com/FileFolderGames.html
45. Web resources
• Blogs: http://theautismteacher.blogspot.com/2009/10/file-folder-activities.html
• Good author (Diane Browder): http://products.brookespublishing.com/cw_contributorinfo.aspx?ContribID=191&Name=Diane+M.+Browder+Ph.D.
46. Questions/Resources
• Tennessee Department of Ed website: go to the assessments page
• Talk to your instructional facilitator or compliance facilitator
• Cecilia.Franklin@mnps.org