Data set options control how SAS processes a data set: which variables and observations are read or written, along with security and attribute settings. They are specified in parentheses after a SAS data set name and include options such as DROP=, KEEP=, RENAME=, FIRSTOBS=, and LABEL=. Options on an input data set are applied before the programming statements execute; options on an output data set are applied after the statements are processed.
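As a minimal sketch of the distinction above (using the sashelp.class sample data set that ships with SAS; the output data set name tall is an illustrative choice):

```sas
/* Input options on the SET statement act BEFORE the step's statements run:
   start at observation 3, read only Name and Height, rename Height to ht. */
data tall(drop=ht);                      /* output option: drop the work variable */
    set sashelp.class(firstobs=3
                      keep=Name Height
                      rename=(Height=ht));
    if ht > 60;                          /* must reference the NEW name, ht */
run;
```

Because KEEP= and RENAME= on the SET statement are applied before the DATA step statements execute, the subsetting IF must use the renamed variable ht, while DROP= on the DATA statement takes effect only when observations are written out.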
Learning
Base SAS,
Advanced SAS,
Proc SQL,
ODS,
SAS in financial industry,
Clinical trials,
SAS Macros,
SAS BI,
SAS on Unix,
SAS on Mainframe,
SAS interview Questions and Answers,
SAS Tips and Techniques,
SAS Resources,
SAS Certification questions...
visit http://sastechies.blogspot.com
SAS Programming For Beginners | SAS Programming Tutorial | SAS Tutorial | SAS... (Edureka!)
This SAS Programming For Beginners tutorial from Edureka will take you through programming concepts in SAS such as DATA and PROC steps, formats, informats, loops, data set operations, and important procedures like PROC MEANS, PROC FREQ, and PROC SUMMARY, among others. We have implemented a randomness-testing demo that uses the SAS FREQ procedure and a chi-square test to check the randomness of a given sample of data. Below are the topics covered in this tutorial:
1. Data Analytics Tools
2. Why SAS?
3. What is SAS?
4. SAS Features
5. Programming Concepts in SAS
6. Use Case – Testing Randomness
7. SAS Job Trends
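As a hedged sketch of the kind of chi-square randomness check the demo describes (the data set name digits, the variable d, and the sample values are illustrative assumptions, not taken from the tutorial), PROC FREQ can test whether digits occur equally often:

```sas
/* Hypothetical sample: one digit per observation in work.digits */
data digits;
    input d @@;
    datalines;
3 7 1 9 0 4 4 2 8 6 5 3 9 1 7 0 2 6 8 5
;
run;

/* One-way chi-square goodness-of-fit test against equal proportions.
   A large p-value gives no evidence against randomness of the digits. */
proc freq data=digits;
    tables d / chisq;
run;
```

By default, the CHISQ option on a one-way table tests the null hypothesis that all levels are equally likely, which is the usual first check for uniform randomness.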
An introduction to SAS, one of the more frequently used statistical packages in business. With hands-on exercises, explore SAS's many features and learn how to import and manage datasets and run basic statistical analyses. This is an introductory workshop appropriate for those with little or no experience with SAS.
Complete workshop materials include demo SAS programs available at http://projects.iq.harvard.edu/rtc/sas-intro
SDTM (Study Data Tabulation Model) defines a standard structure for human clinical trial (study) data tabulations and for nonclinical study data tabulations that are to be submitted as part of a product application to a regulatory authority such as the United States Food and Drug Administration (FDA).
Understanding SAS Data Step Processing (guest2160992)
This presentation covers:
an overview of the SAS 9 Business Intelligence Platform,
SAS Data Integration,
Business Intelligence,
an overview of Business Intelligence information consumers,
and navigating in SAS Data Integration Studio.
For more details visit:
http://vibranttechnologies.co.in/sas-classes-in-mumbai.html
SDTM (Study Data Tabulation Model) defines a standard for organizing and formatting data to streamline the collection, management, analysis, and reporting of human clinical trial data tabulations and non-clinical study data tabulations that are to be submitted as part of a product application (IND and NDA) to a regulatory authority such as the United States Food and Drug Administration (FDA) or the PMDA (Japan).
The purpose of this presentation is to describe, step by step, the transition of a SAS Programmer into a Clinical Statistical Programmer. It can serve as a guideline for SAS Programmers who want to apply their programming and technical expertise in industry.
A SAS Programmer is someone who uses SAS software across a variety of scenarios and purposes.
A Clinical Statistical Programmer, on the other hand, produces the outputs required for clinical trials and develops advanced, real-world solutions to meet future challenges. A primary role of Clinical Statistical Programmers is to use their technical and programming skills to enable clinical trial statisticians to perform their statistical analysis duties more efficiently.
This presentation briefly discusses the smooth transition a SAS Programmer needs to go through in order to become a Clinical Statistical Programmer.
https://www.smartprogram.in/sas
Learn SAS programming, SAS slides, SAS tutorials, SAS certification, SAS sample code, SAS macro examples, SAS video tutorials, SAS ebooks, SAS tips and techniques, Base SAS and Advanced SAS certification, SAS interview questions and answers, Proc SQL, SAS syntax, Advanced SAS
Return of the Codes -- SAS', Windows', and Yours (Mark Tabladillo)
Robust applications engage in the give-and-take discussion between commands and return codes. This presentation encourages applications developers to implement comprehensive return code processing. This paper presents three distinct categories. First, we consider return codes from SAS®. Second, we consider return codes from Windows® (as an example operating system). Third, we discuss development principles for proactively writing your own return messages. The examples draw from SAS/AF® and Windows®, and affect all SAS applications development (including robust SAS Macro development). This paper introduces the rich conversation an application can and should have with its environment. Special attention focuses on error messages and recovering gracefully from unexpected or unintentional results.
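As a hedged illustration of the return-code checking the paper advocates (the library path and data set name are placeholders, not from the paper), a macro can inspect SAS automatic macro variables such as &SYSLIBRC and &SYSERR after each step:

```sas
/* Check the return code of a LIBNAME statement before using the library. */
%macro use_lib(path);
    libname mylib "&path";
    %if &syslibrc ne 0 %then %do;
        %put ERROR: libname failed for &path (SYSLIBRC=&syslibrc);
        %return;
    %end;

    proc means data=mylib.sales;        /* placeholder data set name */
    run;
    /* SYSERR: 0 = success, up to 4 = warning, above 4 = error */
    %if &syserr > 4 %then
        %put ERROR: proc means failed (SYSERR=&syserr);
%mend use_lib;

%use_lib(/data/project1);               /* placeholder path */
```

The same pattern extends to &SYSFILRC for FILENAME statements and &SYSRC for host commands, which is the "conversation with the environment" the abstract describes.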
Data Sets You Free: Analytics for Content Strategy (Jonathon Colman)
The value of content strategy is hard to measure and even harder to forecast. For many content strategists, the hardest part of the job isn't even the content strategy work itself! It's getting your hands untied so that you can help the organization or client take action.
Data is what sets you free. By learning how to tell the story of your content and audience with data, you'll be able to move onward from just TALKING about content strategy to actually DOING it!
Included in this Content Strategy presentation:
1. WHAT data and analytics mean for our content as well as to our partners, colleagues, and clients
2. WHY content strategists should value and use data and analytics in their work
3. HOW to use the Excellent Analytics plug-in for Microsoft Excel to automate the inclusion of Google Analytics data in a content audit (no more cutting and pasting!)
4. HOW to use conditional formatting in Microsoft Excel to make opportunities for improving content pop
5. HOW to visualize and format data and analytics as you report out the results of your content strategy work, including 8 key learnings from Edward Tufte.
If you're ready to start using more data in your content strategy work, then this presentation includes actionable tactics, tools, and links to more information. Remember: you are what you measure - so start measuring the impacts of your work now!
Originally presented at Confab: The Content Strategy Conference in Minneapolis, MN on June 4, 2013.
For more information and tools that you can use to set your content free, see this associated blog post on the Confab Events blog: http://confabevents.com/blog/data-sets-you-free
You can learn more about Jonathon Colman at http://www.jonathoncolman.org/
Also see 200+ free Content Strategy resources at http://www.jonathoncolman.org/2013/02/04/content-strategy-resources/
Vibrant Technologies is headquartered in Mumbai, India. We are the best SAS training provider in Navi Mumbai, offering live projects to students as well as corporate training. According to our students and corporate clients, we offer the best Statistical Analysis System classes in Mumbai.
SAS classes in Mumbai
Best SAS classes in Mumbai with job assistance.
Our features are:
expert guidance by IT industry professionals
lowest fees of 5000
practical exposure to handle projects
well-equipped lab
post-course resume-writing guidance
These slides give a full overview of reading different types of data in R. Thanks to Coursera, the source material was quite comprehensive, and I made some additional changes as well.
Midway in our life's journey, I went astray from the straight imperative road and woke to find myself alone in a dark declarative wood.
My guide out of this dark declarative wood was a familiar friend, SQL, who showed me the way to wrap a context of a window to push through using Window Functions to escape the Inferno.
Next I found myself somewhere in-between running up hill with one foot in front of the other advancing so as the leading foot was always above the ground running with my friend LINQ, I was able to wrap the context of a collection around my data to advance my journey through Purgatorio.
My last guide into the blinding brilliant light of Paradiso was from the Dutch Caribbean, who taught me how to wrap my computations into a context and move my data through leading me into brilliant bliss.
Join me on my divine data comedy.
How to find low-cost or free data science resources 202006 (Mark Tabladillo)
There are many free or low-cost resources for becoming better trained in data science. None of these options equals a formal degree, but short of that scope, these resources are helpful at least for keeping up with technology. This presentation provides specific recommendations on free or low-cost resources based on the Team Data Science Process framework (business understanding, data engineering, modeling, deployment).
This presentation covers some of the major data science and AI announcements from the May 2020 Microsoft Build conference. Included in this talk are 1) Azure Synapse Link, 2) Responsible AI, 3) Project Bonsai & Project Moab, and 4) AI Models at Scale (deep learning with billions of parameters).
Microsoft has released Automated ML technologies for developers through ML.NET, Azure ML Service, and Azure Databricks. This presenter is a data scientist and Microsoft architect, and will give a comprehensive overview of the utility and use case of this automated technology for production solutions. The presentation includes code you can try now.
Automated machine learning (automated ML) automates feature engineering, algorithm and hyperparameter selection to find the best model for your data. The mission: Enable automated building of machine learning with the goal of accelerating, democratizing and scaling AI. This presentation covers some recent announcements of technologies related to Automated ML, and especially for Azure. The demonstrations focus on Python with Azure ML Service and Azure Databricks.
ML.NET 1.0 release is the first major milestone of a great journey that started in May 2018 when we released ML.NET 0.1 as open source. ML.NET is an open-source and cross-platform machine learning framework for .NET developers. Using ML.NET, developers can leverage their existing tools and skillsets to develop and infuse custom AI into their applications by creating custom machine learning models for common scenarios like Sentiment Analysis, Recommendation, Image Classification and more.
This presentation provides an overview of the technology with demos run in a Deep Learning Virtual Machine running Windows Server 2016. Code examples are in C# and F# and run in Visual Studio Community 2019. This technology is ready for production implementation and runs on .NET Core.
This presentation is the first of four related to ML.NET and Automated ML. The presentation will be recorded with video posted to this YouTube Channel: http://bit.ly/2ZybKwI
Automated machine learning (automated ML) automates feature engineering, algorithm and hyperparameter selection to find the best model for your data. The mission: Enable automated building of machine learning with the goal of accelerating, democratizing and scaling AI.
This presentation covers some recent announcements of technologies related to Automated ML, and especially for Azure. The demonstrations focus on Python with Azure ML Service and Azure Databricks.
This presentation is the fourth of four related to ML.NET and Automated ML. The presentation will be recorded with video posted to this YouTube Channel: http://bit.ly/2ZybKwI
NimbusML enables data scientists to use ML.NET to train models in Azure Machine Learning or anywhere else they use Python. NimbusML provides state-of-the-art ML algorithms, transforms and components, aiming to make them useful for all developers, data scientists, and information workers and helpful in all products, services and devices. The components are authored by the team members, as well as numerous contributors from MSR, CISL, Bing and other teams at Microsoft. NimbusML is interoperable with scikit-learn estimators and transforms, while adding a suite of highly optimized algorithms written in C++ and C# for speed and performance.
The trained machine learning model can be used in a .NET application with ML.NET. This presentation will outline the features of NimbusML and provide a notebook-based demonstration using Azure Notebooks.
This presentation is the third of four related to ML.NET and Automated ML. The presentation will be recorded with video posted to this YouTube Channel: http://bit.ly/2ZybKwI
201906 02 Introduction to AutoML with ML.NET 1.0 (Mark Tabladillo)
ML.NET 1.0 release is the first major milestone of a great journey that started in May 2018 when we released ML.NET 0.1 as open source. ML.NET is an open-source and cross-platform machine learning framework for .NET developers. Using ML.NET, developers can leverage their existing tools and skillsets to develop and infuse custom AI into their applications by creating custom machine learning models for common scenarios like Sentiment Analysis, Recommendation, Image Classification and more.
“Automated ML” is a collection of new technologies from Microsoft to enhance the data science development process. Still in preview, Auto ML for ML.NET 1.0 will be demonstrated in a Deep Learning Virtual Machine running Windows Server 2016. Code examples are in C# and run in Visual Studio Community 2019.
This presentation is the second of four related to ML.NET and Automated ML. The presentation will be recorded with video posted to this YouTube Channel: http://bit.ly/2ZybKwI
This presentation focuses on the value proposition for Azure Databricks for Data Science. First, the talk includes an overview of the merits of Azure Databricks and Spark. Second, the talk includes demos of data science on Azure Databricks. Finally, the presentation includes some ideas for data science production.
201905 Azure Certification DP-100: Designing and Implementing a Data Science ... (Mark Tabladillo)
Microsoft has several Azure certifications including DP-100 (Designing and Implementing a Data Science Solution on Azure). Until this month, the exam had been in beta: however, the presenter has just passed the exam (first try). The purpose of this event is to share a viewpoint on how to study for the exam. Today, there are multiple ways to develop and deliver and deploy R or Python or Spark or deep learning models on Azure. The differences are important for this exam.
Big Data Advanced Analytics on Microsoft Azure 201904 (Mark Tabladillo)
This talk summarizes key points for big data advanced analytics on Microsoft Azure. First, there is a review of the major technologies. Second, there is a series of technology demos (focusing on VMs, Databricks and Azure ML Service). Third, there is some advice on using the Team Data Science Process to help plan projects. The deck has web resources recommended. This presentation was delivered at the Global Azure Bootcamp 2019, Atlanta GA location (Alpharetta Avalon).
This presentation anchors best practices for Enterprise Data Science based on Microsoft's "Team Data Science Process". The talk includes introducing the concepts, describing some real-world advice for project planning, and discusses typical titles of professionals who make enterprise data science successful. These techniques also apply for AI (artificial intelligence), deep learning, machine learning, and advanced analytics.
Training of Python scikit-learn models on Azure (Mark Tabladillo)
This intermediate-level presentation covers the latest Azure technology for deploying Python scikit-learn models on Azure. The presentation is a demo using a Microsoft Data Science Virtual Machine (DSVM), Visual Studio Code, Azure Machine Learning Service, Azure Machine Learning Compute, Azure Storage Blobs, and Azure Container Registry to train a model from a Python 3 Anaconda environment.
The presentation will include an architectural diagram and downloadable code from Github.
YouTube recording at https://www.youtube.com/watch?v=HyzbxHBpAbg&feature=youtu.be
Big Data Advanced Analytics on Microsoft Azure (Mark Tabladillo)
This presentation provides a survey of the advanced analytics strengths of Microsoft Azure from an enterprise perspective (with these organizations being the bulk of big data users) based on the Team Data Science Process. The talk also covers the range of analytics and advanced analytics solutions available for developers using data science and artificial intelligence from Microsoft Azure.
Power BI has become an increasingly important data analytics tool. This presentation focuses on the advanced analytics options currently available in Power BI. Attendees to this talk will see:
· Microsoft’s perspective on advanced analytics development: the Team Data Science Process
· What the general options are for advanced analytics on Azure
· What the specific native advanced analytics capabilities are in Power BI
· Some ideas on pairing Power BI with other technologies in advanced analytics architectures
Microsoft Cognitive Toolkit (Atlanta Code Camp 2017) (Mark Tabladillo)
The Microsoft Cognitive Toolkit (CNTK) is a unified deep-learning toolkit that describes neural networks as a series of computational steps via a directed graph. In this directed graph, leaf nodes represent input values or network parameters, while other nodes represent matrix operations upon their inputs.
The objectives of this presentation are to 1) describe what CNTK is, 2) present a comparative evaluation with similar technologies, 3) outline potential applications, and 4) demonstrate the technology with Jupyter Python examples.
Machine learning services with SQL Server 2017 (Mark Tabladillo)
SQL Server 2017 introduces Machine Learning Services with two independent technologies: R and Python. The purpose of this presentation is 1) to describe major features of this technology for technology managers; 2) to outline use cases for architects; and 3) to provide demos for developers and data scientists.
Microsoft Technologies for Data Science 201612 (Mark Tabladillo)
Delivered to SQL Saturday BI Edition -- Atlanta, GA
Microsoft provides several technologies in and around Azure which can be used for casual to serious data science. This presentation provides an overview of the major Microsoft options for on-premise, cloud-based, and hybrid data science. The presenter has used these technologies in various companies and industries, both as a Microsoft consultant and previously as an independent consultant. The speaker also provides insights into data science careers, which help indicate where the business will likely be for consultants and partners.
How Big Companies plan to use Our Big Data 201610 (Mark Tabladillo)
Underneath the shiny popular apps on tablets, smartphones, and entertainment channels are typically large cloud-based data centers. App developers leverage the cloud to provide advertisers with targeted sales opportunities, which has been accounting for an ongoing shift from paper to online media. This presentation will provide updated trends and statistics for 2016 on big data usage (based on consumer use), statistical concerns with big data, and the Microsoft big data story.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 (Albert Hoitingh)
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
A tale of scale & speed: How the US Navy is enabling software delivery from l...sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex ProofsAlex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Epistemic Interaction - tuning interfaces to provide information for AI support
Introduction to SAS Data Set Options
1. Introduction to
Data Set Options
Mark Tabladillo, Ph.D.
Software Developer, MarkTab Consulting
Associate Faculty, University of Phoenix
January 30, 2007
2. Introduction
• Data set options enable special features
during data set processing
• Most SAS data set options can apply to
either input or output SAS data sets in
DATA steps or procedure (PROC) steps
• Data set options allow the DATA step to
control variables, observations, security,
and data set attributes
3. Outline
• Define data set options
• Provide examples in four categories
• Discuss data set processing rules
4. Outline
• Define data set options
• Provide examples in four categories
• Discuss data set processing rules
5. Definition
• Data set options specify actions that
apply only to the SAS data set with which
they appear.
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
6. Syntax
• Specify a data set option in parentheses
after a SAS data set name. To specify
several data set options, separate them
with spaces.
(option-1=value-1<...option-n=value-n>)
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
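A minimal sketch of this syntax, with two options on one input data set separated by a space (the data set and variable names here are illustrative):

```sas
data work.subset;
  /* KEEP= and RENAME= together; KEEP= uses the original variable name */
  set work.master (keep=id amount rename=(amount=total));
run;
```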
7. Outline
• Define data set options
• Provide examples in four categories
• Discuss data set processing rules
8. Quick Examples
• Data set options enable us to perform
operations such as these:
– Renaming variables
– Selecting only the first or last n observations
for processing
– Dropping variables from processing or from
the output data set
– Specifying a password for a data set
– Adding data set labels
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
10. Examples Dataset
data work.sales (drop=i randomState);
length state $2 sales 8 randomState 3;
do i = 1 to 2500;
randomState = round(rand('gaussian',3,1)+0.5);
if randomState in (1,2,3,4,5) then do;
select(randomState);
when(1) state='TN';
when(2) state='AL';
when(3) state='GA';
when(4) state='FL';
when(5) state='MS';
end;
sales = int(rand('gaussian',1000000,500000));
output work.sales;
end;
end;
run;
11. List of Common Options
Variable Control
• DROP= Data Set Option: Excludes variables from processing or from output SAS data sets
• KEEP= Data Set Option: Specifies variables for processing or for writing to output SAS data sets
• RENAME= Data Set Option: Changes the name of a variable
http://support.sas.com/onlinedoc/913/getDoc/en/lrdict.hlp/a000104210.htm
12. Examples: Variable Control
data work.salesReformat;
set work.sales (drop=sales);
run;
data work.salesReformat2;
set work.sales (keep=state);
run;
proc sort data=work.sales (rename=(state=salesState))
out=work.salesReformat3 (drop=sales);
by salesState;
run;
13. List of Common Options
Observation Control
• FIRSTOBS= Data Set Option: Specifies which observation SAS processes first
• IN= Data Set Option: Creates a variable that indicates whether the data set contributed data to the current observation
• OBS= Data Set Option: Specifies when to stop processing observations
• WHERE= Data Set Option: Selects observations that meet the specified condition
http://support.sas.com/onlinedoc/913/getDoc/en/lrdict.hlp/a000104210.htm
14. Examples: Observation Control
* (obs - firstobs) + 1 = number of observations processed;
data work.selectObs1;
set work.sales (firstobs=1 obs=200);
run;
data work.selectObs2;
set work.sales (firstobs=200 obs=400);
run;
proc print data=work.sales (obs=25);
run;
proc freq data=work.sales (firstobs=1);
tables state;
run;
proc means data=work.sales (obs=max);
class state;
var sales;
run;
15. Examples: Observation Control
data work.combineObs1;
set work.selectObs1 (in=in1) work.selectObs2 (in=in2);
length source $12;
if in1 then source = 'Dataset One';
else if in2 then source = 'Dataset Two';
run;
data work.combineObs2;
set work.selectObs1 (in=in1) work.selectObs2 (in=in2);
if in1 and in2 then output;
run;
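The WHERE= option listed among the observation-control options is not demonstrated above; a minimal sketch using the same work.sales data (the output data set names are illustrative):

```sas
data work.bigGeorgia;
  /* WHERE= filters observations as they are read from the input data set */
  set work.sales (where=(state='GA' and sales > 1000000));
run;

proc means data=work.sales (where=(state in ('TN','AL')));
  var sales;
run;
```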
16. List of Common Options
Security
• ALTER= Data Set Option: Assigns an alter password to a SAS file and enables access to a password-protected SAS file
• ENCRYPT= Data Set Option: Encrypts SAS data files
• PW= Data Set Option: Assigns a read, write, or alter password to a SAS file and enables access to a password-protected SAS file
• READ= Data Set Option: Assigns a read password to a SAS file and enables access to a read-protected SAS file
• WRITE= Data Set Option: Assigns a write password to a SAS file and enables access to a write-protected SAS file
http://support.sas.com/onlinedoc/913/getDoc/en/lrdict.hlp/a000104210.htm
17. Examples: Security
data work.secure1 (alter=NoErrors);
set work.sales;
run;
data work.secure2;
set work.sales (alter=NoErrors);
run;
* Note: A SAS password does not control access to a SAS file beyond the SAS
system. You should use the operating system-supplied utilities and file-system
security controls in order to control access to SAS files outside of SAS.;
data work.secure3 (encrypt=yes pw=Scramble);
set work.sales;
run;
proc sort data=work.secure3 (pw=scramble) out=work.secure4;
by state sales;
run;
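READ= and WRITE= from the security table are not shown above; a minimal sketch (the data set names and password values here are illustrative):

```sas
* Assign separate read and write passwords when creating the data set;
data work.secure5 (read=LookOnly write=ChangeIt);
  set work.sales;
run;

* Reading the read-protected data set requires the read password;
proc print data=work.secure5 (read=LookOnly obs=10);
run;
```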
18. List of Common Options
Data Set Attributes
• COMPRESS= Data Set Option: Controls the compression of observations in an output SAS data set
• GENMAX= Data Set Option: Requests generations for a data set and specifies the maximum number of versions
• INDEX= Data Set Option: Defines indexes when a SAS data set is created
• LABEL= Data Set Option: Specifies a label for the SAS data set
http://support.sas.com/onlinedoc/913/getDoc/en/lrdict.hlp/a000104210.htm
19. Examples: Data Set Attributes
data work.compress1 (compress=yes label="Attempt at Compression");
set work.sales;
run;
data work.masterSalesDataset (genmax=3);
set work.sales;
run;
data work.masterSalesDataset;
set work.masterSalesDataset work.selectObs1;
run;
data work.masterSalesDataset;
set work.sales work.selectObs1;
run;
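INDEX= from the table above has no example; a minimal sketch (the output data set names and the composite index name are illustrative):

```sas
* Define a simple index on state as the data set is created;
data work.indexedSales (index=(state));
  set work.sales;
run;

* Define a composite index on state and sales;
data work.indexedSales2 (index=(stateSales=(state sales)));
  set work.sales;
run;
```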
20. Outline
• Define data set options
• Provide examples in four categories
• Di
Discuss d t set processing rules
data t i l
21. Input and Output Datasets
• If a data set option is associated with an
input data set, the action applies to the
data set that is being read.
• If the option appears in the DATA
statement or after an output data set
specification in a PROC step, SAS applies
the action to the output data set.
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
22. Input and Output Datasets
data _null_;
run;
data;
run;
data _null_;
set _null_;
if _n_ ge 0 then put 'hello';
run;
data _null_;
if _n_ ge 0 then put 'hello';
set _null_;
run;
23. Order of Execution
• When data set options appear on both input and
output data sets in the same DATA or PROC
step, SAS applies data set options to input data
sets before it evaluates programming statements
or before it applies data set options to output
data sets.
• Likewise, data set options that are specified for
the data set being created are applied after
programming statements are processed.
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
24. Order of Execution
data work.salesReformat4 (rename=(sales=monthlySales));
set work.sales;
sales = sales/12;
run;
data work.salesReformat5;
set work.sales (rename=(sales=monthlySales));
monthlySales = monthlySales/12;
run;
25. Specification Conflicts
• In some instances, data set options
conflict when they are used in the same
statement. For example, you cannot
specify both the DROP= and KEEP=
options for the same variable in the same
statement.
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
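A minimal sketch of such a conflict: SAS rejects this step because sales appears in both DROP= and KEEP= (the output data set name is illustrative):

```sas
* ERROR: the same variable cannot appear in both DROP= and KEEP=;
data work.conflict;
  set work.sales (drop=sales keep=sales state);
run;
```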
26. Statement Definition
• A SAS statement is a series of items that
may include keywords, SAS names,
special characters, and operators.
• All SAS statements end with a semicolon.
• A SAS statement either requests SAS to
perform an operation or gives information
to the system.
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612375.htm
27. Timing Conflicts
• Timing can also be an issue in some
cases. For example, if using KEEP= and
RENAME= on a data set specified in the
SET statement, KEEP= needs to use the
original variable names, because SAS will
process KEEP= before the data set is
read. The new names specified in
RENAME= will apply to the programming
statements that follow the SET statement.
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
28. Timing Conflicts
proc sort data=work.sales (keep=sales state
rename=(sales=monthlySales))
out=work.salesReformat6;
by state monthlySales;
run;
proc sort data=work.sales (rename=(sales=monthlySales)
keep=sales state)
out=work.salesReformat7;
by state monthlySales;
run;
29. Overriding System Options
• Many system options and data set options
share the same name and have the same
function.
• The data set option overrides the system
option for the data set in the step in which
it appears.
• System options remain in effect for all
DATA and PROC steps in a SAS job or
session, unless they are respecified.
http://support.sas.com/onlinedoc/913/getDoc/en/lrcon.hlp/a002612367.htm
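The interaction above can be sketched with OBS=, which exists as both a system option and a data set option:

```sas
* System option: limits every step that follows to 100 observations;
options obs=100;

* Data set option: overrides the system option for this step only;
proc print data=work.sales (obs=5);
run;

* Restore the system default;
options obs=max;
```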
30. Conclusion
• Data set options enable special features
during DATA and PROC step processing
• The SAS System Documentation provides
specific details on the syntax
31. Contact Information
• Mark Tabladillo
MarkTab Consulting
http://www.marktab.com/