Personal Genomes: what can I do with my data? (Melanie Swan)
Biology evolved to be just good enough to survive, and genomics provides the critical next-generation toolkit for its greater exploitation. Genomics is already starting to be medically actionable and is likely to become increasingly useful over time. This presentation discusses how your genetic information is already useful today.
Morphologomics - Challenges for Surgical Pathology in the Genomic Age, by Dr. Anthony Gill (Cirdan)
This presentation introduces and discusses the concept of ‘morphologomics’, that is, omics approaches critically reimagined and reappraised from the viewpoint of classic morphology.
It was delivered by Dr. Anthony Gill at the Pathology Horizons 2017 conference in Cairns, Australia.
Drug Repositioning Conference, Washington DC, 2019-09-23 (Tudor Oprea)
Discussing the knowledge-based classification of human proteins and its applications in target-repurposing discovery, with potential applications for rare diseases.
Covering our ongoing machine learning efforts using protein knowledge graphs and MetaPath/XGBoost to predict novel protein-disease associations, with specific examples for type 2 diabetes.
Computational Drug Repositioning Workflow.
Addressing the limitations and potential of machine learning in target and drug repurposing.
Drug Repositioning Candidates: Alprazolam / Glycopyrronium / Oteracil.
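The metapath idea behind the list above can be sketched in miniature: count typed paths (e.g. protein to pathway to disease) in a knowledge graph and use the counts as association features; a real pipeline would feed such features into a learner like XGBoost. All graph edges and entity names below are illustrative placeholders, not data from the talk.

```python
from itertools import product

# Toy knowledge graph as edge sets: protein-pathway and pathway-disease links.
# Names are hypothetical examples, not the talk's actual graph.
protein_pathway = {
    ("INS", "insulin_signaling"),
    ("IRS1", "insulin_signaling"),
    ("TP53", "apoptosis"),
}
pathway_disease = {
    ("insulin_signaling", "type_2_diabetes"),
    ("apoptosis", "cancer"),
}

def metapath_count(protein, disease):
    """Count protein->pathway->disease paths (a simple metapath feature)."""
    return sum(
        1
        for (p, pw) in protein_pathway
        for (pw2, d) in pathway_disease
        if p == protein and pw == pw2 and d == disease
    )

# Score every candidate pair; higher counts suggest stronger association.
proteins = {p for p, _ in protein_pathway}
diseases = {d for _, d in pathway_disease}
scores = {(p, d): metapath_count(p, d) for p, d in product(proteins, diseases)}
print(scores[("INS", "type_2_diabetes")])   # 1
print(scores[("TP53", "type_2_diabetes")])  # 0
```

In practice such counts would be computed over many metapath types and normalized before training a classifier.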
Realize preventive medicine through predictive risk profiling, determining baseline markers of wellness and variability, and engaging in personalized pre-clinical interventions
Bioinformatics: Introduction, Objective of Bioinformatics, Bioinformatics Databases, Concept of Bioinformatics, Impact of Bioinformatics in Vaccine Discovery
Genomics in Society: Genomics, Cellular Networks, Preventive Medicine, and Society (Larry Smarr)
10.10.06
Guest Lecture
UCSD Medical and Pharmaceutical Students Foundations of Human Biology--Lecture #41
Title: Genomics in Society: Genomics, Cellular Networks, Preventive Medicine, and Society
La Jolla, CA
Towards Digitally Enabled Genomic Medicine: the Patient of The Future (Larry Smarr)
12.02.22
Invited Speaker
Hacking Life
TTI/Vanguard Conference
Title: Towards Digitally Enabled Genomic Medicine: the Patient of The Future
San Jose, CA
Discuss AI, machine learning, and the hype cycle
Discuss the knowledge-based classification of proteins
Discuss applications of AI/ML to drug discovery
Bioinformatics in the Clinical Pipeline: Contribution in Genomic Medicine (iosrjce)
This review focuses on the new challenges in applying the methodology of modern biology to medical science. Human health is a primary concern, and bioinformatics (in-silico) tools have changed the concept of treating patients, underscoring the need for genomic medicine. Finding treatments with new modes of clinical action is a major health concern in medical science. From a global perspective, the scientific task of constructing new ideas to improve health care and treat disease is challenging, so awareness is needed to accelerate the storage of clinical datasets that scientists can use to design genomic drugs. This outline should drive medicine to draw on public data and create a cognitive approach to using technology in a cost-effective way.
With advances in technology, enormous amounts of data have become available for bioscience researchers. While this high volume of information holds tremendous promise for expanding the science knowledge base, it must be organized for meaningful study. Bioinformatics is a discipline that devises methods for storing, distributing, and analyzing biological data used by diverse areas of research. Bioinformatics professionals develop software and tools that assist researchers in the analysis of data related to molecular biology and genome studies.
Computational challenges in precision medicine and genomics (Gary Bader)
Genomics is mapping complex data about human biology and promises major medical advances. In particular, genomics is enabling precision medicine, the use of a patient's genome and physiological state to improve therapeutic efficacy and outcome. However, routine use of genomics data in medical research is in its infancy, due mainly to the challenges of working with "Big data". These data are so complex and large that typical researchers are not able to cope with them. Collectively, these data require an understanding of many aspects of experimental biology and medicine to correctly process and interpret. Data size is also an issue, as individual researchers may need to handle tens of terabytes (genomes from a few hundred patients), which is challenging to download and store on typical workstations. To effectively support precision medicine, scientists from a wide range of disciplines, including computer science, must develop algorithms to improve precision medicine (e.g. diagnostics and prognostics), genome interpretation, raw data processing and secure high performance computing.
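The "tens of terabytes" figure above can be checked with back-of-envelope arithmetic, assuming a common rule of thumb of roughly 100 GB of raw sequence data per 30x whole genome (an assumption on my part; actual sizes vary with coverage and file format):

```python
# Back-of-envelope check of the "tens of terabytes" claim.
GB_PER_GENOME = 100  # assumed raw FASTQ/BAM footprint per 30x whole genome
patients = 300       # "a few hundred patients"

total_gb = GB_PER_GENOME * patients
total_tb = total_gb / 1000
print(f"{total_tb:.0f} TB")  # 30 TB, i.e. tens of terabytes
```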
Alexandra Basford, InCoB 2011: A Journal’s Perspective on Data Standards and Biocuration (GigaScience, BGI Hong Kong)
Alexandra Basford's talk in the curation session at the InCoB meeting in Kuala Lumpur, 30/11/11, on GigaScience: A Journal’s Perspective on Data Standards and Biocuration.
Presentation - Intelligence Enhancer and Genius 3.0 (Hang Wu)
Traditional geniuses usually suffer from either genetic abnormalities or unusual socio-political behaviors; these are what we call Genius 1.0 and Genius 2.0. A new kind of genius, using technology to boost intelligence while maintaining humanity, is proposed as the next evolution of the human being.
Even when traditional geniuses hold great genetic advantages, socio-political factors have prevented them from being accepted by the world. I therefore designed the concept of Genius 3.0 to describe people who use modern neural engineering to enhance the brain.
SixthSense is a recent wearable gestural interface that augments the physical world around us with digital information.
The Noetic perspective (from Greek noetikos, "mental"; nous, "mind") identifies the [human] mind as the nexus of the future evolution of humanity. At present, human evolution is a mental process rather than a biological or technological one.
The Noetic model describes mind as a relation-generating complex system arising as a product of biological evolution and manifesting certain defining characteristics such as systemic closure, self-reference, plasticity, etc. This model aims to integrate a systemic view with the mental constructs of the subjective plane. According to the Noetic model, human identity is a dynamic constructive process that brings forth the human observer as the subject of its perceptive and mental states. This process is identified as mind. Images and narratives are the elements encompassing the experiential and mental aspects of the identity process as they appear to the human observer.
The idea of mind as the theater of evolutionary processes is further explored: mind as a complex system can essentially be disassociated from the historical conditions of its emergence; therefore it is virtually unbound in its evolutionary potential. This has deep implications for the understanding of human nature and the human condition. Finally, the ideas of openness and freedom beyond utility are proposed as futuristic directives of the consciously guided evolution of mind.
What computational principles explain the success of human intelligence? I will describe recent work that combines together the unbounded flexibility of mathematical logic with the robustness of statistical inference. This combination brings us several steps closer to understanding human intelligence -- and to the tools for true intelligence engineering.
Noah D. Goodman is a research scientist in the Department of Brain and Cognitive Sciences at MIT, and a member of the Computer Science and Artificial Intelligence Laboratory. He studies the computational basis of human thought, merging behavioral experiments with formal methods from statistics and logic. He received his Ph.D. in mathematics from the University of Texas at Austin. After a brief stint as a Chicago real estate developer, he joined the Computational Cognitive Science group at MIT. Goodman has published more than thirty publications in psychology, cognitive science, artificial intelligence, and mathematics. Several of these papers have won awards.
Military 2.0 - Patrick Lin - H+ Summit @ Harvard (Humanity Plus)
For better or worse, the military is a major driver of technological, world-changing innovations, such as the Internet. At the same time, wars and armed conflicts are a key roadblock in the evolution of humanity. Therefore, to understand how emerging technologies will change our lives, we must look at their military origins as a harbinger of things to come for society at large. This presentation will focus on ethical and policy questions arising from two key areas making headlines today and in the future: human enhancement technologies and robotics.
For instance, are there moral or practical issues with eliminating human emotions such as fear or anger, which have led to abuses and accidents in wartime? Must these enhancements (and others, such as super-strength) be temporary or reversible, considering that soldiers usually return to civilian life? Robots can discourage such abuses if equipped with cameras, becoming objective and unblinking observers on the battlefield, but would this erode cohesion and trust among soldiers – and in the civilian realm, would surveillance robots infringe on our privacy? Generally, would these new technologies make it easier to engage in war, since they would lower political costs by reducing the number of casualties on our side – if so, is it immoral, or otherwise counterproductive to humanity's progress, to develop these capabilities?
Patrick Lin is the director of the Ethics + Emerging Sciences Group , based at California Polytechnic State University, San Luis Obispo. Most recently, he has led research efforts that culminated in two major reports: Autonomous Military Robotics: Risk, Ethics, and Design (funded by the U.S. Dept. of Defense/Navy, 2008) and Ethics of Human Enhancement: 25 Questions & Answers (funded by the U.S. National Science Foundation, 2009). He has published several books and papers in the field of technology ethics, including a new monograph What Is Nanotechnology and Why Does It Matter?: From Science to Ethics (Wiley-Blackwell, 2010) and a forthcoming anthology Robot Ethics: The Social and Ethical Implication of Robotics (MIT Press, in preparation). Dr. Lin earned his B.A. from University of California at Berkeley, M.A. and Ph.D. from University of California at Santa Barbara, and completed a three-year post-doctoral appointment at Dartmouth College. He is currently an assistant professor in Cal Poly’s philosophy department and an ethics fellow at the U.S. Naval Academy.
Presentation given by Amon Twyman, DPhil, to UKH+, 11th July 2009.
"Extreme Simulation Scenarios: Thinking about the promise, risk, and plausibility of AI & VR"
Genomic self-hacking: citizen science and the realization of personalized medicine (Melanie Swan)
Quantified Self Ignite talk. Redesigning biology may be man's ultimate artistic and scientific exploit. The first steps are reading and writing genetic data with DNA sequencing and synthetic biology. Already, human genome sequencing costs have declined such that individuals worldwide are accessing their own genomic data and can explore it through open-source science communities such as DIYgenomics.
DIYgenomics: An Open Platform for Democratizing the Genome (Melanie Swan)
Redesigning biology may be man's ultimate artistic and scientific exploit. The first steps are reading and writing genetic data with DNA sequencing and synthetic biology. Already human genome sequencing costs have declined such that individuals worldwide are accessing their own genomic data, and can explore it through open-source science communities such as DIYgenomics.
Patient-Organized Genomic Research Studies (Melanie Swan)
DIYgenomics has developed a methodology for the conduct of patient-organized genomic research studies, obtaining outcomes by linking genomic data to phenotypic data and intervention. The general hypothesis is that individuals with one or more polymorphisms in the main variants associated with conditions may be more likely to have baseline out-of-bounds phenotypic biomarker levels, and could benefit the most from targeted intervention.
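The hypothesis above can be read as a simple group comparison: carriers of a risk variant versus non-carriers, compared on whether baseline biomarker levels fall outside a reference range. A minimal sketch in plain Python, with entirely hypothetical records and reference bounds (not DIYgenomics data or methodology):

```python
from statistics import mean

# Hypothetical study records: (carries risk variant?, baseline biomarker value).
# Values and the reference bound below are illustrative only.
records = [
    (True, 14.2), (True, 15.8), (True, 16.1),   # variant carriers
    (False, 8.9), (False, 10.4), (False, 9.7),  # non-carriers
]
REFERENCE_UPPER = 12.0  # baseline is "out of bounds" above this

carriers = [v for carrier, v in records if carrier]
noncarriers = [v for carrier, v in records if not carrier]

def frac_oob(vals):
    """Fraction of a group whose baseline exceeds the reference bound."""
    return sum(v > REFERENCE_UPPER for v in vals) / len(vals)

print(frac_oob(carriers), frac_oob(noncarriers))  # 1.0 0.0
print(mean(carriers) > mean(noncarriers))         # True
```

A real study would of course need proper sample sizes, covariate adjustment, and significance testing rather than a raw group comparison.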
DIYgenomics: an open platform for citizen science (Melanie Swan)
DIYgenomics (www.DIYgenomics.org) is a new platform bringing citizen scientists together to run peer cohort research studies and conduct novel research linking genetic data and physical biomarkers. Some norms are developing in response to the variety of community-based research issues that arise such as adaptive studies, informed consent, security, anonymity, and study design.
A look at future directions for biology. Personalized genomics is a key step in moving towards individualized medicine and preventative interventions. The traditional trial and error approach of molecular biology is being replaced by the direct design of synthetic biology. Synthetically developed energy solutions could have a substantial impact on natural resource demand.
Genetic Engineering in Agriculture (hanneloremccaffery)
Few topics in agriculture are more polarizing than genetic engineering (GE), the process of manipulating an organism's genetic material—usually using genes from other species—in an effort to produce desired traits such as higher yield or drought tolerance.
GE has been hailed by some as an indispensable tool for solving the world's food problems, and denounced by others as an example of human overreaching fraught with unknown, potentially catastrophic dangers. UCS experts analyze the applications of genetic engineering in agriculture—particularly in comparison to other options—and offer practical recommendations based on that analysis.
Benefits of GE: Promise vs. Performance
Supporters of GE in agriculture point to a multitude of potential benefits of engineered crops, including increased yield, tolerance of drought, reduced pesticide use, more efficient use of fertilizers, and ability to produce drugs or other useful chemicals. UCS analysis shows that actual benefits have often fallen far short of expectations.
Health and Environmental Risks
While the risks of genetic engineering have sometimes been exaggerated or misrepresented, GE crops do have the potential to cause a variety of health problems and environmental impacts. For instance, they may produce new allergens and toxins, spread harmful traits to weeds and non-GE crops, or harm animals that consume them.
At least one major environmental impact of genetic engineering has already reached critical proportions: overuse of herbicide-tolerant GE crops has spurred an increase in herbicide use and an epidemic of herbicide-resistant "superweeds," which will lead to even more herbicide use.
How likely are other harmful GE impacts to occur? This is a difficult question to answer. Each crop-gene combination poses its own set of risks. While risk assessments are conducted as part of GE product approval, the data are generally supplied by the company seeking approval, and GE companies use their patent rights to exercise tight control over research on their products. In short, there is a lot we don't know about the risks of GE—which is no reason for panic, but a good reason for caution.
What Other Choices Do We Have?
All technologies have risks and shortcomings, so critics must always address the question: what are the alternatives? In the case of GE, there are two main answers: crop breeding, which produces traits through the organism's reproductive process; and agroecological farm management, which seeks to make the most of a plant's existing traits by optimizing its growing environment. These approaches are generally far less expensive than GE, and often more effective.
The biotechnology industry has acknowledged the value of breeding as a complement to GE. But at the same time, the industry has used its formidable marketing and lobbying resources to ensure that its products—and the industrial methods those products are designed to support—continue to dominat ...
AI Health Agents: Longevity as a Service in the Web3 GenAI Quantum Revolution (Melanie Swan)
Health Agents are a form of Math Agent: a personalized AI health advisor delivering “healthcare by app” instead of “sickcare by appointment.” Mobile devices can check health 1000 times per minute, as opposed to the standard once-a-year doctor's office visit, and can model virtual patients in a digital twin app. Like any AI agent, Health Agents “speak” natural language to humans and formal language to the computational infrastructure, possibly outputting the mathematics of personalized homeostatic health as part of their operation. Health Agents could enable physicians to oversee the health of thousands of individuals at a time. This could ease overstressed healthcare systems, contribute to physician well-being, and help address the fact that, per the World Health Organization, more than half of the global population is still not covered by essential health services.
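The scale of the monitoring claim is easy to quantify: continuous checks at 1000 per minute versus one annual office visit differ by roughly eight orders of magnitude (assuming round-the-clock operation, which is my assumption, not the abstract's):

```python
# Data points per year from continuous mobile monitoring vs. one annual visit.
checks_per_year_mobile = 1000 * 60 * 24 * 365  # 1000 checks/minute, 24/7
checks_per_year_visit = 1                      # one office visit per year

ratio = checks_per_year_mobile / checks_per_year_visit
print(f"{ratio:.2e}")  # 5.26e+08 readings per office visit
```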
The computational infrastructure is becoming a vast interconnected fabric of formal methods, including a major shift from 2D grids to 3D graphs in machine learning architectures
The implication is systems-level digital science at unprecedented scale for discovery in a diverse range of scientific disciplines
We know that we are in an AI take-off; what is new is that we are in a math take-off. A math take-off is using math as a formal language, beyond the human-facing math-as-math use case, for AI to interface with the computational infrastructure. The message of generative AI and LLMs (large language models like GPT) is not that they speak natural language to humans, but that they speak formal languages (programmatic code, mathematics, physics) to the computational infrastructure, implying the ability to create a much larger problem-solving apparatus for humanity-benefitting applications in biology, energy, and space science, though not without risk.
This work introduces “quantum intelligence,” a concept of intelligence for operating in the quantum realm that may help in a potential AI-Quantum Computing convergence (~2030e) and towards the realization of SRAI for well-being (economics, health, energy, space). “Scale-free intelligence” is formulated as a generic capacity for learning.
AI did not spring onto the scene with ChatGPT; it has been in an ongoing multi-year adoption. A transition may be underway from an information society to a knowledge society (one tempered by, and specifically using, knowledge to improve the human condition). AI is a dual-use technology with both significant risk and upleveling possibilities.
SRAI for well-being is a social objective, and also a technological objective. SRAI is part of AI development and within the technological trajectory of harnessing all scales of physical reality ranging from quantum materials to space exploration.
Conceptually, thinking in quantum and relativistic terms expands the physical worldview, and likewise the social worldview of entities inhabiting the larger world. Practically, SRAI may be realized in phases: short-term regulation and registries, medium-term agents learning to implement human values with internal reward functions, and long-term responsible human-AI entities acting in partnership in a future of SRAI for well-being.
The Human-AI Odyssey: Homerian Aspirations towards Non-labor Identity (Melanie Swan)
The visionary progression in The Odyssey from shipbuilding to seafaring to advanced civilization informs the contemporary tension in the human-AI relation, forcing a broader articulation of human identity beyond labor identity. Edith Hall analyzes why one of the earliest known literatures, The Odyssey, remains a central cultural trope with numerous references in the storytelling vernacular of all eras, ranging from 1860s British theater to a highly-watched 1990 episode of The Simpsons. The argument is that The Odyssey provides a constant aspirational reference for human identity – who we think we are and where we are going on the epic journey of life, especially at the current crossroads in our relationship with technology.
The contemporary moment finds humanity, and the humanities, experiencing an identity crisis in the relationship with technology. Information science is having an ever more pervasive role in academia, and the machine economy continues to offload vast classes of tasks to labor-saving technology giving rise to two questions. First, at the level of labor-identity, humans wonder who they are as they have long defined their sense of self through their professional participation in the economy. Second, at the level of human-identity, with AI now performing cognitive labor in addition to physical labor, humans wonder if there is anything that remains uniquely human.
The effect of The Odyssey is to provide world-expanding imaginaries to change the way we see ourselves as subjects; in this way, Homer is an early modernist in reconfiguring our self-concept.
This work applies a philosophy (of literature)-aided information science method to discuss how Homer’s Odyssey persists as a literary imaginary to help us think through potential futures of human-AI flourishing as rapid automation continues to impact humanity. The intensity of the human-AI relation is likely to increase, which invites thought leadership to steward the transition to a potential AI abundance economy with fulfilling human-technology collaboration.
The shipbuilding-seafaring-advanced civilization progression in The Odyssey shows that the human-AI relation is not the labor-identity crisis of “robots stealing our jobs,” but rather the more difficult challenge of envisioning who we can be in the new larger world of human-AI partnership addressing a larger set of planetary-scale problems. Towards this new configuration of the human-AI relation, the longer term may hold radically different notions of identity, as we become physical-virtual hybrids, augmented post-disease entities in the health-faring, space-civilizing, energy-marshalling post-scarcity cultures of the future.
AdS Biology and Quantum Information Science (Melanie Swan)
Quantum Information Science is a fast-growing discipline advancing many areas of science such as cryptography, chemistry, finance, space science, and biology. In particular AdS/Biology, an interpretation of the AdS/CFT correspondence in biological systems, is showing promise in new biophysical mathematical models of topology (Chern-Simons (solvable QFT), knotting, and compaction). For example, one model of neurodegenerative disease takes a topological view of protein buildup (Aβ plaques and tau tangles in Alzheimer’s disease, alpha-synuclein in Parkinson’s disease, TDP-43 in ALS). AdS/Neuroscience methods are implicated in integrating multiscalar systems with different bulk-boundary space-time regimes (e.g. oncology tumors, fMRI + EEG imaging), entanglement (correlation) renormalization across scales (MERA, random tensor networks, melonic diagrams), entropy (possible system states), entanglement entropy (interrelated fluctuations and correlations across system tiers), and non-ergodicity (implied efficiency mechanisms since biology does not cycle through all possible configurations per temperature (thermotaxis), chemotaxis, and energy cues); Maxwell’s demon of biology (partition functions), conservation across system scales (biophysical gauge symmetry (system-wide conserved quantity)), and the presence of codes (DNA, codons, neural codes). A multiscalar AdS/CFT correspondence is mobilized in 4-tier ecosystem models (light-plankton-krill-whale and ion-synapse-neuron-network (AdS/Brain)).
Humanity’s constant project is expanding the range of attainable geography. Melville’s romance of the sea gives way to Kerouac’s romance of the road, and now the romance of space. In expanding into new geographies, markets (commerce) are the driving impulse, entailing a legal and judicial system to order the new larger continuous marketplace, which brings a bigger overall scope of the world under our control, and hence a new idea of who we are as subjects in this bigger domain.
Space Humanism is a concept of humanism based on the principles of inclusion, progress, and equity posited as a condition of possibility for a potential large-scale human movement into space. A philosophy of literature approach is used to contextualize Space Humanism, first through Melville-Foucault to articulate the mind-frame of extra-planetary geographies as one of human expansion, and second through posthuman philosophy extending from Shakespeare’s Renaissance humanism to contemporary enhancement-based theories of subjectivation.
Historical imaginaries outline subjectivation moments that have changed the whole notion of who we are as humanity. Four examples are: the concept of the “new world” in Hegel’s philosophy, von Humboldt’s infographic maps, Baudelaire as the Painter of Modern Life, and Keats’s seeing the world in a new way upon reading an updated translation of Homer.
The reach to beyond-Earth geographies is a two-cultures project involving both arts and science. Technical competence is necessary to realize the aspirational, explorational, and survivalist aims of humanity pushing beyond planetary limits. Space was once a fantastic dream that is becoming quotidian with fourteen U.S. spaceports, six completed Blue Origin space tourist missions, and SpaceX having over 155 successful rocket launches including human space flights to and from the International Space Station. The notion of Space Human articulated through Shakespeare, Moby-Dick, and neuroenhancement informs the project of our reach to awaiting beyond-Earth geographies.
Quantum Information Science and Quantum Neuroscience (Melanie Swan)
Mathematical advance in quantum information science is proceeding quickly and applies to many fields, particularly the complexities of neuroscience (here focusing on image-readable physical behaviors such as neural signaling, as opposed to higher-order operations of cognition, memory, and attention). Quantum mathematical models are extensible to neuroscience problem classes treating dynamical time series, diffusion, and renormalization in multiscalar systems. Approaches first reconstruct wavefunctions observed in EEG and fMRI scans. Second, single-neuron models (Hodgkin-Huxley, integrate-and-fire, theta neurons) and collective neuron models (neural field theories, Kuramoto oscillators) are employed to model empirical data. Third, genome physics is used to study time series sequence prediction in DNA, RNA, and proteins based on 3d+ complex geometry involving fields, curvature, knotting, and information compaction. Finally, quantum neuroscience physics is applied in AdS/Brain modeling, Chern-Simons biology (topological invariance), neuronal gauge theories, network neuroscience, and the chaotic dynamics of bifurcation and bistability (to explain epileptic and resting states). The potential benefit of this work is an improved understanding of disease and pathology resolution in humans.
Quantum information science enables a new tier of scientific problem-solving, as exemplified in early-adopter fields: foundational tools in quantum cryptography, quantum machine learning, and quantum chemistry (molecular quantum mechanics), and advanced applications in quantum space science, quantum finance, and quantum biology.
Grammatology and Performativity: A Critical Theory of Silence: Silence is a crucial device for subversion, opposition, and socio-political commentary, the theoretical underpinnings of which are just starting to be understood. This work illuminates another position in the growing field of critical silence studies, theorizing silence as an asset whose ontological value has been lost in a world of literal and figurative noise. Part 1 philosophizes silence as a continuation of Derrida’s grammatology project. Such a grammatology of silence valorizes silent thinking over noisy speaking, and identifies the deconstructive binary pairing not as silence-speaking, but rather as silence-noise. Noise has a simultaneous physical-virtual existence as Shannon entropy calculates signal-to-noise ratios in modern communications networks. Part 2 employs the philosophy of noise to assess what is conceptually necessary to overcome noise in a critical theory of silence. Malaspina draws from Simondon to argue that noise is a form of individuation, essentially a living thing with unstoppable growth potential, not defined by a binary on-off switch but as a matter of gradation. Hence different theory resources are required to oppose it. Part 3 then develops a critical theory of silence to oppose noise in both its physical and virtual instantiations, with the two arms of a deeply human positive performativity (Szendy, Bennett) and a beyond-computational posthumanism (Puar). The result is a novel critical theory of silence as positive performativity that destabilizes noise and recoups the ontological status of silence as not merely an empty post-modern reification but a meaningful actuality.
Philosophy-aided Physics at the Boundary of Quantum-Classical Reality: The philosophical themes of truth-knowledge and appearance-reality are used to interrogate the contemporary situation of the quantum-classical boundary, and more broadly the quantum-classical-relativistic stratification of physical scale boundaries. The contemporary moment finds us proceeding at breakneck pace through the industrial information revolution, digitizing the remaining matter-based industries into a seamless exchange between physical and digital reality. Digitized news is giving way to digitized money and perhaps, in the farther future, digitized mindfiles (such as personalized connectome files for precision medicine, autologous (own-DNA) stem cell therapies, and CRISPR for Alzheimer’s disease prevention). Our technologies are giving us control over vast new domains: the relativistic with GPS and space-faring, and the quantum with quantum computing, harnessing the properties of superposition, entanglement, and interference. Philosophy provides critical thinking tools that can help us understand and master these rapid shifts in science and technology, to avoid an Adornian instrumental rationality (subsuming humanity under societal structures) and to maintain a Heideggerian backgrounded and enabling relation with technology (versus technology enframing us as mindless standing reserve).
The philosophical theme underlying the investigation of the scales of planets, persons, and particles is the relationship between truth and knowledge (or appearance and reality). The truth-knowledge problem is whether knowledge of the truth, true knowledge, the reality under the appearance, is even possible. Three salient moments in the history of the truth-knowledge problem are examined here. These are the German idealism of Kant and Hegel, the deconstructive postmodernism of Foucault and Derrida, and the unclear leanings of the current moment. The German idealism lens incorporates the self-knowing subject as agent into the truth and knowledge problem. The postmodernist view breaks with the subject and emphasizes the hidden opposites in the formulations, the constant reinterpretation of meaning, and porous boundaries. The contemporary moment wonders whether truth-knowledge boundaries still hold, in a Benjaminian view of non-identity between truth and knowledge, and truth increasingly being seen as a Foucauldian biopolitical manufactured quantity. Contemporaneity has a bimodal distribution of the subject: the hyperself (the constantly digitally represented selfie self) and the alienated post-subject subject.
These moments in the truth and knowledge debate inflect into the scale considerations of relativity, classicality, and quantum mechanics. Whereas general relativity and quantum mechanics are domains of universality, totality, and multiplicity, everyday classical reality is squeezed in as a belt between the two multiplicities as the concretion of drawing a triangle or tossing a ball. Recasting truth and k
Comprehensive philosophical programs arise within a historical context (for Hegel and Derrida in the democracy-shaping moments of the French Revolution (1789) and the student-worker protests (1968) in which French politics serve as a global harbinger of contemporary themes). In the Derrida-Hegel relationship, there is more rapprochement concerning core notions of difference, history, and meaning-assignation than may have been realized. In particular, Hegel’s philosophy, despite being assumed to be a totalizing system, in fact indicates precisely some of the same kinds of revised metaphysics-of-presence formulations that Derrida exhorts, namely those that are flexible, expansive, and include non-identity and identity.
A crucial Derrida-Hegel interchange is that of différance and difference. Derrida develops the notion directly from Hegel (“Différance,” “The Pit and the Pyramid”), but only draws from the Encyclopedia, not Hegel’s masterwork, the Phenomenology of Spirit. For Derrida, the “A” in différance is inspired by the form of the pyramid in the capitalized letter and in Hegel’s comparing the sign “to the Egyptian Pyramid” (“Différance,” p. 3). Derrida invokes the symbolism of the pyramid, antiquity, and Egyptian hieroglyphics as an early semiotic system. However, when considering Hegel’s central definition of difference in the dialectical progression of thesis-antithesis-synthesis in the Phenomenology of Spirit (§§159-163), the articulations of différance and difference are remarkably aligned.
Parallel formulations are also seen in history as a series of reinterpretable events, and indexical wrappers as a mechanism for meaning assignation. The thinkers examine the universal and the particular by exploring regulative mechanisms such as law (natural and social). In Glas, Derrida highlights not the singular-universal relation, but the law of singularity and the law of universality relation as being relevant to Hegel’s Antigone interpretation (Glas, p. 142a), a theme continued in “Before the Law.” Finally (time permitting), there is a question whether the most valid critiques of Hegel (Nietzsche’s unreason and Benjamin’s non-synthesis), as alternatives to Hegelian dialectics, are visible in Derrida’s thought.
The upshot is that the two thinkers produce similar formulations, derived from different trajectories of philosophical work; a situation which points to the potential universality of fundamental solution classes to open-ended philosophical problems, including the future of democracy.
Quantum Moreness: Kantian Time and the Performative Economics of Multiplicity
There is no domain with greater moreness than that of the quantum. A philosophy-aided physics approach (postmodernism and Continental philosophy) examines the contemporary situation of quantum moreness (more time and space dimensions than are available classically). Quantum moreness is configured by quantum reality being probabilistic; a multiplicity of outcomes all co-existing in superposition until collapsed in measurement. The quantum mindset uses quantum moreness to solve problems by thinking in terms of the greater scalability afforded in time and space with the quantum properties of superposition, entanglement, and interference. Quantum studies fields proliferate in arts and sciences, raising the Levi-Straussian raw-cooked dilemma of how “traditional humanities” are to be named alongside “digital humanities” and “quantum humanities.” Kant facilitates the conceptualization of quantum moreness by insisting on the dual nature of time as transcendentally ideal and empirically real. Kant’s moreness is allness, the absolute totality and multiplicity of time at the ideal level. Each faculty (sensibility, understanding, reason) has its own species of the a priori synthetic unity of ideal time that precedes and conditions the operation of the faculty. Each faculty also has a concretized formulation of empirically-real time as the time series, which is the basis for the faculties to interoperate to perform the conception of any empirical object. Kant’s achievement of time interoperability has potential extensibility to other areas of temporal incompatibility such as the scales of general relativity, Newtonian mechanics (human-scale), and quantum mechanics. The quantum moreness mindset with which Kant connects the ideal-real is visible in the domain of economics, itself too an ideal-real construction. 
The quantum moreness of money configures the postmodern abstraction of global cryptocurrencies and smart contract pledges, the implicative hope of which is a post-debt capital world that restores the human esprit in the face of an increasingly intense technologized reality.
Blockchain Crypto Jamming: Subverting the Instrumental Economy
The ultimate subversion is money, refusing the pecuniary resources of the state. This project applies a philosophical and critical theory lens to examine the use of nomenclature in one of the most radical longitudinal transformations in contemporary times, the shift away from state-run monetary resources towards cryptocurrencies and smart contracts in citizen-determined decentralized financial networks.
A Cryptoeconomic Theory of Social Change is presented in which linguistic progression serves as a tracking mechanism. The steps to lasting change have their own vocabulary (Brandom). First, there is the social critique, the complaint about what is wrong, the negative side (Adorno and Horkheimer highlight instrumental reason and the empty culture industry). Second, there is the antidote, an alternative that can overcome the complaint, the positive side. Third, the solution becomes the new reality, and as a consequence, the whole of reality is now seen in this context, adopting its vocabulary (“fiat health” system for example, referring to the antiquated method). The social movement graduates from language game (Wittgenstein) to form of life (Jaeggi).
Blockchains are Occupy with teeth, notable in the level of personal responsibility-taking by individuals to steward their own financial resources. The crypto citizen is not merely trading CryptoKitties and Bored Ape Yacht Club tokens, but getting blocktime loans through DeFi liquidity pools instead of fiat banks, earning labor income in crypto, and shifting all economic activity to blockchain networks. The artworld signals mainstream acceptance with Christie’s auction of Beeple’s non-fungible token digital artwork for $69 million. At the global level, coin communities constitute a new form of Kardashev-level (planetary-scale) democracy. Blockchains emerge as a robust smart network automation technology for super-class projects ranging from space-faring to quantum computing and thought-tokening. The further stakes of this work are having a language-based theory of social change with broad applicability to social transformation.
This work argues that the emerging understanding of time in quantum information science can be articulated as a philosophical theory of change. Change and time are interrelated, and one can be used to interrogate the other; namely, a theory of change can be derived from a theory of time. What is new in quantum science is time being regarded as just another property to be engineered. At the quantum scale, time is reversible in certain ways, which is quite different from the everyday experience of time, whose unidirectional arrow does not allow a dropped egg to reassemble. At the quantum scale of atoms, though, a particle retains the history of its trajectory, which may be retraced before it is collapsed in measurement.
Quantum scientists evolve systems backward and forward in time, controlling phase transitions with Floquet engineering. Quantum systems are entangled in time and space, with temporal correlations exhibiting greater multiplicity than spatial correlations. The chaotic time regimes of ballistic spread followed by saturation are implemented in quantum walks for faster search and heightened cryptosecurity. In quantum neuroscience, seizure may be explained by chaotic dynamics and normal resting state by Floquet-like periodic cycles. Time is revealed to have the same kinds of repeating structures as space (described by entanglement, symmetry, and topology), differently instantiated and controlled.
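The "ballistic spread" attributed to quantum walks above can be illustrated with a short classical simulation. The sketch below is a standard discrete-time Hadamard-coin walk on a line, written by me as a pedagogical aid rather than taken from the talk: position spread grows linearly in the number of steps, versus the sqrt(t) diffusive spread of a classical random walk.

```python
import numpy as np

def hadamard_walk_std(steps):
    """Discrete-time quantum walk on a line with a Hadamard coin.
    Returns the standard deviation of the position distribution, which
    grows ballistically (~steps) rather than diffusively (~sqrt(steps))."""
    n = 2 * steps + 1
    # amp[position, coin]; start at the origin with a symmetric coin state
    amp = np.zeros((n, 2), dtype=complex)
    amp[steps, 0] = 1 / np.sqrt(2)
    amp[steps, 1] = 1j / np.sqrt(2)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        amp = amp @ H.T                   # apply the coin flip
        new = np.zeros_like(amp)
        new[1:, 0] = amp[:-1, 0]          # coin state 0 steps right
        new[:-1, 1] = amp[1:, 1]          # coin state 1 steps left
        amp = new
    prob = (np.abs(amp) ** 2).sum(axis=1)
    pos = np.arange(n) - steps
    mean = (prob * pos).sum()
    return np.sqrt((prob * (pos - mean) ** 2).sum())
```

After 100 steps the quantum walk's spread is several times the sqrt(100) = 10 expected of a classical random walk, the speedup that underlies quantum-walk search.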
The quantum understanding of time can be propelled into a macroscale theory of change through its connotation of a more flexible, malleable, probabilistic interface with reality. Change becomes less rigid. Probability is the lever of change, but it is notoriously difficult for humans to grasp, as we think better in storylines than statistics. The idea that in quantum systems time, space, and dynamics (change) are all just manipulable parameters is an empowering frame for the acceptance of change. The quantum mindset affords greater facility with probability-driven events (change).
Blockchains in Space: Non-Euclidean Spacetime and Tokenized Thinking - Two requirements for the large-scale beyond-terrestrial expansion of human intelligence into the universe are the ability to operate in diverse spatiotemporal regimes and to instantiate thinking in various formats. Newtonian mechanics describe everyday reality, but Einsteinian physics is needed for GPS and the orbital technologies of telescopes and spacecraft. Space agencies already integrate the Earth-day and the slightly-longer Martian-sol. A more substantial move into space requires facility with non-Euclidean spacetimes. One challenge is that general relativity and quantum mechanics are non-interoperable. However, the theories can be formulated together when considering black holes and quantum computing since geometric theories and gauge theories are both field-based. Quantum blockchains instantiate blockchain logic in quantum computational environments. Blockchains have their own temporal regime (blocktime: the number of blocks for an event to occur), and hence quantum blocktime is a non-classical functionality for operating in diverse spatiotemporal regimes. Thinking is a rule-based activity that is unrestricted by medium. Central to thinking is concepts, which are referenced by words. Word-types include universals, particulars, and indexicals which can be encoded into a formal system as thought-tokens, and registered to blockchains. Blockchains are contemplated as an automation technology for asteroid mining and space settlement construction, and thought-tokening adds an intelligence layer. Time and tokenized thinking come together in the idea of smart networks in space. In blockchain quantum smart networks, spatiotemporal regimes and thought-tokens are simply different value types (asset classes) coordinated with blockchain logic, towards the aim of extending human capabilities into the farther reaches of space.
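The abstract's definition of blocktime (the number of blocks for an event to occur) can be glossed in a few lines of Python. This is my own hypothetical sketch, not an API from the talk; the class name and the assumed average block interval (roughly 12 seconds, in the vicinity of Ethereum's) are illustrative only:

```python
from dataclasses import dataclass

@dataclass
class ChainClock:
    """Toy model of blocktime: events are scheduled in blocks, not seconds.
    seconds_per_block is an assumed average block interval."""
    seconds_per_block: float

    def blocks_until(self, current_block: int, target_block: int) -> int:
        """Remaining blocktime until a target block (0 if already passed)."""
        return max(0, target_block - current_block)

    def eta_seconds(self, current_block: int, target_block: int) -> float:
        """Estimated wall-clock wait for a blocktime-denominated event."""
        return self.blocks_until(current_block, target_block) * self.seconds_per_block
```

The point of the toy is that blocktime is the chain's native temporal regime; wall-clock time is only a derived estimate, which is what makes it a distinct spatiotemporal regime in the sense used above.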
Cryptography, entanglement, and quantum blocktime: Quantum computing offers a more scalable energy-efficient platform than classical computing and supercomputing, and corresponds more naturally to the three-dimensional structure of atomic reality. Blockchains are a decentralized digital economic system made possible by the 24-7 global nature of the internet.
Quantum Neuroscience: CRISPR for Alzheimer’s, Connectomes & Quantum BCIsMelanie Swan
This talk provides an introduction to quantum computing and how it may be deployed to study the human brain and its diseases of pathology and aging. Refined to its present state over eons of evolution, the brain is one of the most complex systems known, with 86 billion neurons and 242 trillion synapses connected in intricate patterns and rewired by synaptic plasticity. Research continues to illuminate the mysteries of the brain. Quantum computing provides a more capacious architecture with greater scalability and energy efficiency than current methods of classical computing and supercomputing, and more naturally corresponds to the three-dimensional structure of atomic reality. The vision for quantum neuroscience is to model the nature of the brain exactly as it is, in three-dimensional atomically-accurate representations. Neuroscience (particularly genetic disease modeling, connectomics, and synaptomics) could be the “killer application” of quantum computing. Implementations in other industries are also important, including quantum finance, quantum cryptography using Shor’s factoring algorithm (“the Y2K of Crypto”), Grover’s search, quantum chemistry, eigensolvers, quantum machine learning, and continuous-time quantum walks. Quantum computing is a high-profile worldwide scientific endeavor, with platforms currently available via cloud services (IBM Q 27-qubit, IonQ 32-qubit, Rigetti 19Q Acorn), and is in the process of being applied in various industries including computational neuroscience.
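Grover's search, mentioned above, can be demonstrated with a small classical statevector simulation; this is a pedagogical sketch I wrote in NumPy, not code for the hardware platforms named in the talk. The oracle flips the sign of the marked item's amplitude, and the diffusion step inverts all amplitudes about their mean, boosting the marked item's measurement probability over roughly (pi/4)*sqrt(N) iterations:

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Classical statevector simulation of Grover's algorithm.
    Returns the probability of measuring the marked item after
    floor((pi/4) * sqrt(N)) iterations."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
    iterations = max(1, int(np.floor(np.pi / 4 * np.sqrt(N))))
    for _ in range(iterations):
        state[marked] *= -1                   # oracle: flip marked amplitude
        state = 2 * state.mean() - state      # diffusion: inversion about the mean
    return state[marked] ** 2
```

For a 2-qubit search (N = 4) a single iteration already yields the marked item with certainty, versus the N/2 lookups a classical search needs on average; this quadratic speedup is the substance of the "Grover's search" application named above.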
Art Theory: Two Cultures Synthesis of Art and ScienceMelanie Swan
Thesis: Aesthetic resources contribute broadly to the human endeavor of progress, self-understanding, and science, beyond the immediate experience of art. Aesthetic Resources are frameworks, concepts, and modes of expression in art, literature, and philosophy that capture the imagination and the intellect through the senses. The role of art is to inspire the future: the romance of the sea, the open road, space.
The arts are a hallmark of civilization, but can their benefit be crystallized as aesthetic resources that can be mobilized to new situations? How can aesthetic resources help in moments of crisis?
A worldwide social identity crisis has been provoked by pandemic recovery, politics, equity, and environmental sustainability. Philosophical and aesthetic resources can help. Understanding art as a reflection of who we are as individuals and groups, this talk explores conceptualizations of art, with examples, in different periodizations from the 1800s to the present. A marquee definition of what constitutes an artwork is Adorno’s, for whom the work must promulgate its own natural law and engage in novel materials manipulation. For many theorists, art is the pressing of our self-concept into concrete materiality (whether pyramids, sculpture, or painting). What do contemporary periodizations of art mean for our current and forward-looking self-concept? Recent eras include the neo-avant-gardes of 1945, the conceptual art of the 1960s, and post-conceptual art starting in the 1970s, produced generatively with found materials, the digital domain, and audience interactivity. What is the now-current idea of art? Are today’s Baudelairian flâneur and Balzacian modern hero incarnated in the quantum aesthetic imaginary and the digital cryptocitizen? Far from the “end of art” thesis sometimes attributed to Hegel, aesthetic practices are more relevant than ever. Individually and societally, we are reinventing creative energy and productive imagination in venues from science, technology, health, and biology to the arts.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio, using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, machine learning over just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains are only realized when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
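One way to gloss "semantics as predictable inference" is with a toy example of my own devising (not code from the talk): when a relation such as a hypothetical `subclass_of` carries a transitivity semantics, link prediction over the knowledge graph becomes deterministic entailment rather than statistical guessing. The sketch computes the transitive closure of such a relation:

```python
def infer_transitive(edges, relation="subclass_of"):
    """Complete a knowledge graph (set of (head, relation, tail) triples)
    under a transitivity rule: (a, r, b) and (b, r, c) entail (a, r, c).
    The entailed links are *predictable*: they follow from the relation's
    semantics, not from pattern-matching over embeddings."""
    facts = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(facts):
            if r1 != relation:
                continue
            for (b2, r2, c) in list(facts):
                if r2 == relation and b2 == b and (a, relation, c) not in facts:
                    facts.add((a, relation, c))
                    changed = True
    return facts

kg = {("cat", "subclass_of", "mammal"), ("mammal", "subclass_of", "animal")}
closed = infer_transitive(kg)
```

A purely neural link predictor might or might not score ("cat", "subclass_of", "animal") highly; a structure with actual semantics guarantees it, which is the predictability the abstract appeals to.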
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. I have also often seen developers implement front-end features simply by following a framework's standard conventions, assuming this is enough to launch the project successfully, and then the project fails. How can this be prevented, and which approach should you choose? I have launched dozens of complex projects, and during the talk we will analyze which approaches have worked for me and which have not.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides by me and Rik Marselis at the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what Testing in DevOps is. We concluded with a lovely workshop in which the participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and on application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Knowledge engineering: from people to machines and back
Scaling citizen science genomics
1. DIYgenomics Open-source preventive medicine and scaling citizen science genomics Melanie Swan Founder DIYgenomics 650-681-9482 @DIYgenomics www.DIYgenomics.org [email_address] July 28, 2011, OSCON, Portland OR Slides: http://slideshare.net/LaBlogga
2.
3. Biology is an information technology Image credit: http://pubs.acs.org/cen/_img/87/i50/8750cover2_law.gif I hate you 01001001001000000110100001100001011101000110010100100000011110010110111101110101 I love you 01001001001000000110110001101111011101100110010100100000011110010110111101110101 Image credit: http://www.nanoporetech.com/sequences 4th Gen: Electronic Sequencing
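The slide's point that text (and, by analogy, DNA) is just encoded information can be reproduced with a few lines of Python. This is a minimal sketch added for illustration, not code from the original deck:

```python
def to_binary(text):
    """Encode a string as the concatenated 8-bit ASCII codes shown on the slide."""
    return "".join(format(ord(ch), "08b") for ch in text)

def from_binary(bits):
    """Decode a string of 8-bit groups back into text."""
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

encoded = to_binary("I love you")
print(encoded)               # 01001001 00100000 01101100 ... (without spaces)
print(from_binary(encoded))  # I love you
```

Round-tripping any ASCII message through `to_binary` and `from_binary` recovers the original text, which is the slide's "biology is an information technology" analogy in miniature.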
4. Biology is the information technology Image credit: J. Craig Venter Institute Image credit: Anthony Atala lab Image credit: Thomas Matthiesen Artificial cell booted to life Algal biofuel Image credit: http://www.rexresearch.com Whole organ decellularization and recellularization (heart) Organ regeneration (urethra) DNA nanotechnology latch box for drug delivery Image credit: Aarhus University
5. Agenda Citizen science progress to date Scaling citizen science Grand vision next steps Image credit: http://www.gettyimages.com
6.
7.
8. Consumer genomics test landscape (chart axes: cost vs. service breadth, from single/few condition to multiple condition to whole genome; 1 = lower cost with family group or medical condition)
Matchmaking: ScientificMatch $1,995; GenePartner $10-$99
Paternity: Genelex $200-$475; Identigene $149-$399
Pregnancy screening: Counsyl $349
Nutrigenomics: APO E Gene Diet $389; Inherent Health $99
Public studies: Coriell (15 conditions); Scripps (Navigenics) (28 conditions); Pers. Genome Proj., Harvard Med. Sch. (conditions undisclosed)
Genetic disorders, predisposition: DNA Direct $200-$3,500; Matrix Genomics $199-$799
Multiple condition, drug sensitivity: 23andme (201 conditions); Navigenics* (40 conditions); deCODEme (49 conditions); Pathway Genomics* (71 conditions); prices as charted: $1,000, $99, $299, $2,500, $999, $2,000, $985
Whole genome/exome: Knome, EdgeBio, Knome, Illumina; prices as charted: $6,000, $19,500, $48,000, $350,000, $99,500, $68,500, $5,000 (1), $39,500
*Must be physician-ordered
9.
10. Consumer genomics: interpretation variance Source: www.DIYgenomics.org and Swan, M. Multigenic Condition Risk Assessment in Direct-to-Consumer Genomic Services. Genet. Med. 2010, May;12(5):279-88. Private data upload: Marat Nepomnyashy
11.
12. Citizen science health landscape Source: Extended from Swan, M. Emerging patient-driven health care models: an examination of health social networks, consumer personalized medicine and quantified self-tracking. Int. J. Environ. Res. Public Health 2009, 2, 492-525. Health collaboration communities Health social networks
13. Lifecycle of a health condition Pre-clinical phase (80%): preventive medicine, long-tail medicine, self-tracking, wellness profiling, health community collaboration, applied healthspan engineering. Clinical phase (20%): traditional medicine, disease treatment, medical expertise, emergencies, exceptions. Chart: number of conditions becoming clinical over time; goal is a decrease in clinical conditions over time.
14.
15.
16. Homocysteine metabolism pathway Source: Swan, M., Hathaway, K., Hogg, C., McCauley, R., Vollrath, A. Citizen science genomics as a model for crowdsourced preventive medicine research. J Participat Med. 2010 Dec 23; 2:e20.
17.
18.
19. Personal health collaboration studies More information: www.DIYgenomics.org www.DIYgenomics.org/DIYgenomics_poster.ppt
20. Agenda Citizen science progress to date Scaling citizen science Grand vision next steps Image credit: http://www.gettyimages.com
21.
22.
23. Athletic performance Image credit: http://www.istockphoto.com V = number of variants; % = ratio of favorable polymorphisms to total alleles for a sample individual; S = number of studies. Source: Swan, M. Applied genomics: personalized interpretation of athletic performance GWAS. Jan 2011.
Endurance, power, and energy:
- Endurance: ACE, ACTN3, ADRB2/ADRB3, BDKRB2, COL5A1, GNB3 (V=7, %=50, S=22)
- Power: ACE, ACTN3, AGT (V=3, %=50, S=8)
- Energy: HIF1A, PPARGC1A (V=3, %=25, S=9)
Musculature, and heart and lung capacity:
- Muscle fatigue and repair: HNF4A, NAT2, IL-1B (V=5, %=40, S=4)
- Strength: HFE, HIF1A, IGF1, MSTN (GDF8) (V=5, %=17, S=15)
- Heart and lung capacity: CREB1, KIF5B, NOS3, NPY, ADRB1, APOE, NRF1 (V=9, %=36, S=11)
Metabolism, recovery, and other:
- Metabolism: AMPD1, APOA1, PPARA, PPARD (V=5, %=50, S=9)
- Recovery: CKMM/CKM, IL6 (V=2, %=50, S=5)
Ligament and tendon strength:
- Ligament strength: COL1A1, COL5A1, CILP (V=3, %=50, S=4)
- Tendon strength: COL1A1, COL5A1, GDF5, MMP3 (V=7, %=63, S=5)
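The "%" metric on this slide (favorable polymorphisms over total alleles for one individual) can be sketched in a few lines. The genotypes and "favorable" allele calls below are made up for illustration; they are not values from the cited GWAS interpretation:

```python
# Hypothetical favorable-allele calls per gene (illustrative only)
FAVORABLE = {"ACTN3": "R", "ACE": "I", "ADRB2": "G"}

def favorable_ratio(genotypes, favorable):
    """Percentage of a person's alleles that match the favorable call.

    genotypes: gene -> pair of alleles, e.g. {"ACTN3": ("R", "X")}
    """
    favorable_count = 0
    total_alleles = 0
    for gene, alleles in genotypes.items():
        total_alleles += len(alleles)
        favorable_count += sum(1 for a in alleles if a == favorable.get(gene))
    return round(100 * favorable_count / total_alleles)

person = {"ACTN3": ("R", "X"), "ACE": ("I", "D"), "ADRB2": ("G", "G")}
print(favorable_ratio(person, FAVORABLE))  # 4 favorable of 6 alleles -> 67
```

A real interpretation would map dbSNP rsIDs to alleles per study; the table's percentages come from the same kind of count over the listed gene variants.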
24. Study design template: MTHFR example Source: http://diygenomics.pbworks.com http://diygenomics.pbworks.com/w/file/36469280/DIYgenomics+study+design+template+blank.doc Cyanocobalamin Image credit: http://wikimedia.org
25. DIYgenomics study ecosystem – CRO 2.0 Sponsors Funders Study manager Graduate student partner* Study operation platform (Genomera) Study advisors* Participants Oversight (IRB) * Domain expert
26.
27.
28.
29. Agenda Citizen science progress to date Scaling citizen science Grand vision next steps Image credit: http://www.gettyimages.com
30.
31.
32.
33.
34.
35.
36. Circles of preventive medicine The individual at the center, with three surrounding circles: 1. Automated digital health monitoring; 2. Preventive care (health social networks, citizen science studies, health advisors); 3. Traditional health care system and physicians. Source: Extended from Swan, M. Emerging patient-driven health care models: an examination of health social networks, consumer personalized medicine and quantified self-tracking. Int. J. Environ. Res. Public Health 2009, 2, 492-525.
37. Health self-management Source: Extended from Swan, M. Emerging patient-driven health care models: an examination of health social networks, consumer personalized medicine and quantified self-tracking. Int. J. Environ. Res. Public Health 2009, 2, 492-525, Figure 1.
38. Thank you! Melanie Swan Founder DIYgenomics 650-681-9482 @DIYgenomics www.DIYgenomics.org [email_address] Slides: http://slideshare.net/LaBlogga Creative Commons 3.0 license Collaborators: Lorenzo Albanello Janet Chang Cindy Chen Jon Dekay John Furber Eri Gentry Kristina Hathaway Takashi Kido Laura Klemme Lucymarie Mantese Raymond McCauley Louis Nahum Marat Nepomnyashy Ted Odet Roland Parnaso William Reinhardt Greg Smith Aaron Vollrath Lawrence S. Wong Crowd-sourced clinical trials Personal genome apps