Nexus of Biology and Computing - a look at how biologically-inspired models are supplementing traditional linear computational methodologies
Audio: http://feeds.feedburner.com/BroaderPerspectivePodcast
Effect of Data Size on Feature Set Using Classification in Health Domain (dbpublications)
In the health domain, a critical issue is predicting disease at an early stage. Because disease prediction rests largely on physician experience, many machine learning approaches have been contributed to the task. Existing approaches concentrate on either prediction or feature selection. The aim of this paper is to present the effect of data size and feature set on disease prediction in the health domain using Naïve Bayes, showing how each attribute, or combination of attributes, behaves on datasets of different sizes.
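A minimal sketch of the kind of experiment this abstract describes, assuming a scikit-learn setup: train Naïve Bayes on growing slices of a dataset and on different feature subsets, and compare accuracy. The breast-cancer dataset, data fractions, and feature pairs below are illustrative stand-ins, not the paper's own data.

```python
from itertools import combinations

from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Effect of data size: train on 10%, 50%, and 100% of the training split.
for frac in (0.1, 0.5, 1.0):
    n = int(len(X_train) * frac)
    model = GaussianNB().fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"train size {n:4d}: accuracy {acc:.3f}")

# Effect of feature set: score every pair drawn from the first four features.
for feats in combinations(range(4), 2):
    cols = list(feats)
    model = GaussianNB().fit(X_train[:, cols], y_train)
    acc = accuracy_score(y_test, model.predict(X_test[:, cols]))
    print(f"features {feats}: accuracy {acc:.3f}")
```

Comparing the printed accuracies across rows gives exactly the attribute-by-size behaviour the paper studies.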
The Architecture of a System for Predicting Student Performance Based on Data Science Approaches (Thada Jantakoon)
The goals of this study are to develop the architecture of a system for predicting student performance based on data science approaches (the SPPS-DSA Architecture) and to evaluate it. The research process is divided into two stages: (1) context analysis and (2) development and assessment. The data are analyzed statistically using standard deviations. The findings suggest that the SPPS-DSA Architecture consists of three key components: (i) data source, (ii) machine learning methods and attributes, and (iii) data science process. Overall, the architecture was rated as highly appropriate. Predicting student performance helps educators and students improve their teaching and learning processes, and the various analytical methods for doing so are reviewed here. Most researchers used CGPA and internal assessment as data sets. Among prediction methods, classification is widely used in educational data science, with neural networks and decision trees the techniques most commonly applied to predict student performance.
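As a sketch of the classification approach the review highlights, a decision tree over CGPA and internal-assessment features might look as follows. The synthetic student records and the pass/fail labelling rule are invented for illustration; they are not data from the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 300
cgpa = rng.uniform(2.0, 4.0, n)        # cumulative GPA on a 4-point scale
internal = rng.uniform(0, 100, n)      # internal assessment score
X = np.column_stack([cgpa, internal])

# Hypothetical pass/fail label: a weighted blend of the two features.
y = (0.6 * (cgpa / 4.0) + 0.4 * (internal / 100) > 0.6).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", tree.score(X_te, y_te))
```

A neural network (e.g. `MLPClassifier`) could be dropped in on the same feature matrix, which is how the reviewed studies typically compare classifiers.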
Project Unity: The Way of the Future for Plant Breeding (Phenome Networks)
Project Unity is a platform that will host all phenotype-to-genotype public-domain data in a common, unified platform, offered as a free service to academia. Each researcher will be able to load their data and connect it to existing global knowledge by linking traits to ontologies, markers to genetic/physical maps, and germplasms to pedigrees and their sources. Initially, each dataset is stored privately and can only be accessed by the researcher, who can compare their results to public ones. Data are made public once the researcher decides to do so, typically after publication of the corresponding scientific paper.
N. Jimenez: Informatics for Health: Computational Genomics and Precision Medicine (COIICV)
Talk: Informatics for health: computational genomics and precision medicine. COIICV: 10th Computer Engineering Congress of the Valencian Community.
These slides contain character quotes and a summary of the gameplay and environment.
A full picture of -omics cellular regulatory networks brings researchers closer to a realistic and reliable understanding of complex conditions. For more information, please visit: http://tbioinfopb.pine-biotech.com/
"You can be a Polymath: Innovation through Multidisciplinary Thinking" - slides (Anindo Ghosh)
Slides and video of talk on "You can be a Polymath: Innovation through Multidisciplinary Thinking", at The Goa Project 2014 (http://thegoaproject.com/)
Surrogate Science: How Fisher, Neyman-Pearson, and Bayes Were Transformed into... (jemille6)
Gerd Gigerenzer (Director of the Max Planck Institute for Human Development, Berlin, Germany) in the PSA 2016 Symposium: Philosophy of Statistics in the Age of Big Data and Replication Crises
Using structures inspired by viruses and their analogues, virus-like particles, Crystallized aims to reinterpret the beauty and evolutionary power of the virus world within the constraints of wearable media. Crystallized explores the structural forms of biologically protective particles in vaccines as an analogy for clothing’s fundamental role as protective architecture.
Knowledge Management in the AI-Driven Scientific System (Subhasis Dasgupta)
In this dynamic talk, we'll explore the transformative role of AI in scientific knowledge management. We'll delve into how AI revolutionizes data organization, analysis, and hypothesis testing, enhancing efficiency and discovery. Highlighting the seamless integration with existing research processes, we'll address the training and ethical considerations of AI adoption. Through real-world examples, we'll demonstrate AI's impact on scientific breakthroughs, emphasizing the shift towards more collaborative and innovative research landscapes. This presentation aims to inspire the scientific community to embrace AI, leveraging its potential to redefine the boundaries of knowledge and innovation.
Supervised Multi-Attribute Gene Manipulation for Cancer (paperpublications3)
Abstract: Data mining, the extraction of hidden predictive information from large databases, is a powerful new technology with great potential to help companies focus on the most important information in their data warehouses. Data mining tools predict future trends and behaviours, allowing businesses to make proactive, knowledge-driven decisions. The automated, prospective analyses offered by data mining move beyond the analyses of past events provided by retrospective tools typical of decision support systems.
They scour databases for hidden patterns, finding predictive information that experts may miss because it lies outside their expectations. Data mining techniques are the result of a long process of research and product development. This evolution began when business data was first stored on computers, continued with improvements in data access, and more recently, generated technologies that allow users to navigate through their data in real time. Data mining takes this evolutionary process beyond retrospective data access and navigation to prospective and proactive information delivery.
Designing Interactive Visualisations to Solve Analytical Problems in Biology (Cagatay Turkay)
Slides for my talk for the Cambridge Visualization of Biological Information Meetup held January 2015. I talk about why biology is exciting for visualisation researchers and go through examples where visualisation can help experts in understanding their data.
Excited to share our vision for bioinformatics education for students and researchers who want to apply advanced multi-omics integration and machine learning to large biomedical datasets. Practice and learn from real-life projects.
With the surge in modern research focus toward Pervasive Computing, many techniques and challenges need to be addressed to effectively create smart spaces and achieve miniaturization. In the process of scaling down to compact devices, the real things to ponder are the information retrieval challenges. In this work, we discuss the aspects of multimedia that make information access challenging. An example pattern recognition scenario is presented, along with the mathematical techniques that can be used to model uncertainty, for developing a system that can sense, compute, and communicate in a way that makes human life easier, with smart objects assisting from the surroundings.
Towards Automatic Composition of Multicomponent Predictive Systems (Manuel Martín)
Automatic composition and parametrisation of multicomponent predictive systems (MCPSs) consisting of chains of data transformation steps is a challenging task. In this paper we propose and describe an extension to the Auto-WEKA software which now allows users to compose and optimise such flexible MCPSs using a sequence of WEKA methods. The experimental analysis focuses on how significantly extending the search space, by incorporating additional hyperparameters of the models, affects the quality of the found solutions. In a range of extensive experiments, three different optimisation strategies are used to automatically compose MCPSs on 21 publicly available datasets. A comparison with previous work indicates that extending the search space improves classification accuracy in the majority of cases. The diversity of the found MCPSs is also an indication that fully and automatically exploiting different combinations of data cleaning and preprocessing techniques is possible and highly beneficial for different predictive models. This can have a big impact on the development, maintenance, and scalability of the high-quality predictive models needed in modern application and deployment scenarios.
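Auto-WEKA itself searches over WEKA methods; as an analogy only, the same idea of jointly searching preprocessing steps and model hyperparameters can be sketched with a scikit-learn pipeline. The pipeline components and the search space below are illustrative choices, not the paper's configuration.

```python
from scipy.stats import randint
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# An MCPS-like chain: data cleaning -> preprocessing -> predictive model.
pipe = Pipeline([
    ("impute", SimpleImputer()),
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(random_state=0)),
])

# Hyperparameters of every component are searched together, which is what
# "extending the search space" means in the abstract.
space = {
    "impute__strategy": ["mean", "median"],
    "scale__with_mean": [True, False],
    "clf__n_estimators": randint(10, 200),
    "clf__max_depth": randint(2, 10),
}

X, y = load_iris(return_X_y=True)
search = RandomizedSearchCV(pipe, space, n_iter=10, cv=3, random_state=0)
search.fit(X, y)
print("best CV accuracy:", round(search.best_score_, 3))
```

Swapping the random search for Bayesian optimisation (as Auto-WEKA's SMAC does) changes only the search strategy, not the composed-pipeline search space.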
Study on Cyber Security: Establishing a Sustainable Cyber Security Framework for University Automation System (Rihab Rahman)
The research proposal is about my current research project titled "Study on Cyber Security: Establishing a Sustainable Information Security Framework for University Automation System"
Keynote Address, Expanding Horizons 2012, Macquarie University
http://staff.mq.edu.au/teaching/workshops_programs/expanding_horizons
"Learning Analytics": unprecedented data sets and live data streams about learners, with computational power to help make sense of it all, and new breeds of staff who can talk predictive models, pedagogy and ethics. This means rather different things to different people: unprecedented opportunity to study, benchmark and improve educational practice, at scales from countries and institutions, to departments, individual teachers and learners. "Benchmarking" may trigger dystopic visions of dumbed down proxies for 'real teaching and learning', but an emu response is no good. For educational institutions, our calling is to raise the quality of debate, shape external and internal policy, and engage with the companies and open communities developing the future infrastructure. How we deploy these new tools rests critically on assessment regimes, what can be logged and measured with integrity, and what we think it means to deliver education that equips citizens for a complex, uncertain world.
A Big Picture in Research Data Management (Carole Goble)
A personal view of the big picture in Research Data Management, given at the GFBio - de.NBI Summer School 2018, "Riding the Data Life Cycle!", Braunschweig Integrated Centre of Systems Biology (BRICS), 3-7 September 2018
AI Health Agents: Longevity as a Service in the Web3 GenAI Quantum Revolution (Melanie Swan)
Health Agents are a form of Math Agent: a personalized AI health advisor delivering “healthcare by app” instead of “sickcare by appointment.” Mobile devices can check health 1,000 times per minute, as opposed to the standard once-per-year doctor’s office visit, and model virtual patients in a digital-twin app. Like any AI agent, Health Agents “speak” natural language to humans and formal language to the computational infrastructure, possibly outputting the mathematics of personalized homeostatic health as part of their operation. Health Agents could enable physicians to oversee the health of thousands of individuals at a time. This could ease overstressed healthcare systems, contribute to physician well-being, and help address the fact that (per the World Health Organization) more than half of the global population is still not covered by essential health services.
The computational infrastructure is becoming a vast interconnected fabric of formal methods, including a major shift from 2D grids to 3D graphs in machine learning architectures
The implication is systems-level digital science at unprecedented scale for discovery in a diverse range of scientific disciplines
We know that we are in an AI take-off, what is new is that we are in a math take-off. A math take-off is using math as a formal language, beyond the human-facing math-as-math use case, for AI to interface with the computational infrastructure. The message of generative AI and LLMs (large language models like GPT) is not that they speak natural language to humans, but that they speak formal languages (programmatic code, mathematics, physics) to the computational infrastructure, implying the ability to create a much larger problem-solving apparatus for humanity-benefitting applications in biology, energy, and space science, however not without risk.
This work introduces “quantum intelligence,” a concept of intelligence for operating in the quantum realm, which may help in a potential AI-quantum computing convergence (~2030e) and in the realization of SRAI for well-being (economics, health, energy, space). “Scale-free intelligence” is formulated as a generic capacity for learning.
AI did not spring onto the scene with ChatGPT but is in an ongoing multi-year adoption. A transition may be underway from an information society to a knowledge society (one tempered by, and specifically using, knowledge to improve the human condition). AI is a dual-use technology with both significant risk and upleveling possibilities.
SRAI for well-being is a social objective, and also a technological objective. SRAI is part of AI development and within the technological trajectory of harnessing all scales of physical reality ranging from quantum materials to space exploration.
Conceptually, thinking in quantum and relativistic terms expands the physical worldview, and likewise the social worldview of entities inhabiting the larger world. Practically, SRAI may be realized in phases: short-term regulation and registries, medium-term agents learning to implement human values with internal reward functions, and long-term responsible human-AI entities acting in partnership in a future of SRAI for well-being.
The Human-AI Odyssey: Homerian Aspirations towards Non-labor Identity (Melanie Swan)
The visionary progression in The Odyssey from shipbuilding to seafaring to advanced civilization informs contemporary tension in the human-AI relation forcing a broader articulation of human-identity beyond labor-identity. Edith Hall analyzes why one of the earliest known literatures, The Odyssey, remains a central cultural trope with numerous references in the storytelling vernacular of all eras, ranging from 1860s British theater to a highly-watched 1990 episode of The Simpsons. The argument is that The Odyssey provides a constant aspirational reference for human identity – who we think we are and where we are going on the epic journey of life, especially at the current crossroad in our relationship with technology.
The contemporary moment finds humanity, and the humanities, experiencing an identity crisis in the relationship with technology. Information science is having an ever more pervasive role in academia, and the machine economy continues to offload vast classes of tasks to labor-saving technology giving rise to two questions. First, at the level of labor-identity, humans wonder who they are as they have long defined their sense of self through their professional participation in the economy. Second, at the level of human-identity, with AI now performing cognitive labor in addition to physical labor, humans wonder if there is anything that remains uniquely human.
The effect of The Odyssey is to provide world-expanding imaginaries to change the way we see ourselves as subjects; in this way, Homer is an early modernist in reconfiguring our self-concept.
This work applies a philosophy (of literature)-aided information science method to discuss how Homer’s Odyssey persists as a literary imaginary to help us think through potential futures of human-AI flourishing as rapid automation continues to impact humanity. The intensity of the human-AI relation is likely to increase, which invites thought leadership to steward the transition to a potential AI abundance economy with fulfilling human-technology collaboration.
The shipbuilding-seafaring-advanced civilization progression in The Odyssey identifies that the human-AI relation is not one of the labor-identity-crisis of “robots stealing our jobs,” but rather one of the more difficult challenge of envisioning who we can be in the new larger world of human-AI partnership addressing a larger set of planetary-scale problems. Towards this new configuration of human-AI relation, the longer-term may hold radically different notions of identity, as we become physical-virtual hybrids, augmented post-disease entities in the health-faring, space-civilizing, energy-marshalling post-scarcity cultures of the future.
AdS Biology and Quantum Information Science (Melanie Swan)
Quantum Information Science is a fast-growing discipline advancing many areas of science such as cryptography, chemistry, finance, space science, and biology. In particular, AdS/Biology, an interpretation of the AdS/CFT correspondence in biological systems, is showing promise in new biophysical mathematical models of topology (Chern-Simons (solvable QFT), knotting, and compaction). For example, one model of neurodegenerative disease takes a topological view of protein buildup (Aβ plaques and tau tangles in Alzheimer’s disease, alpha-synuclein in Parkinson’s disease, TDP-43 in ALS). AdS/Neuroscience methods are implicated in integrating multiscalar systems with different bulk-boundary space-time regimes (e.g. oncology tumors, fMRI + EEG imaging), entanglement (correlation) renormalization across scales (MERA, random tensor networks, melonic diagrams), entropy (possible system states), entanglement entropy (interrelated fluctuations and correlations across system tiers), and non-ergodicity (implied efficiency mechanisms, since biology does not cycle through all possible configurations, per temperature (thermotaxis), chemotaxis, and energy cues); also the Maxwell’s demon of biology (partition functions), conservation across system scales (biophysical gauge symmetry, a system-wide conserved quantity), and the presence of codes (DNA, codons, neural codes). A multiscalar AdS/CFT correspondence is mobilized in 4-tier ecosystem models (light-plankton-krill-whale and ion-synapse-neuron-network (AdS/Brain)).
Humanity’s constant project is expanding the range of attainable geography. Melville’s romance of the sea gives way to Kerouac’s romance of the road, and now the romance of space. In expanding into new geographies, markets (commerce) is the driving impulse, entailing a legal and judiciary system to order the new larger continuous marketplace, which brings a bigger overall scope of world under our control, and hence a new idea of who we are as subjects in this bigger domain.
Space Humanism is a concept of humanism based on the principles of inclusion, progress, and equity posited as a condition of possibility for a potential large-scale human movement into space. A philosophy of literature approach is used to contextualize Space Humanism, first through Melville-Foucault to articulate the mind-frame of extra-planetary geographies as one of human expansion, and second through posthuman philosophy extending from Shakespeare’s Renaissance humanism to contemporary enhancement-based theories of subjectivation.
Historical imaginaries outline subjectivation moments that have changed the whole notion who we are as humanity. Four examples are: the concept of the “new world” in Hegel’s philosophy, von Humboldt’s infographic maps, Baudelaire as the Painter of Modern Life, and Keats’s seeing the world in a new way upon reading an updated translation of Homer.
The reach to beyond-Earth geographies is a two-cultures project involving both arts and science. Technical competence is necessary to realize the aspirational, explorational, and survivalist aims of humanity pushing beyond planetary limits. Space was once a fantastic dream that is becoming quotidian with fourteen U.S. spaceports, six completed Blue Origin space tourist missions, and SpaceX having over 155 successful rocket launches including human space flights to and from the International Space Station. The notion of Space Human articulated through Shakespeare, Moby-Dick, and neuroenhancement informs the project of our reach to awaiting beyond-Earth geographies.
Quantum Information Science and Quantum Neuroscience (Melanie Swan)
Mathematical advance in quantum information science is proceeding quickly and applies to many fields, particularly the complexities of neuroscience (here focusing on image-readable physical behaviors such as neural signaling, as opposed to higher-order operations of cognition, memory, and attention). Quantum mathematical models are extensible to neuroscience problem classes treating dynamical time series, diffusion, and renormalization in multiscalar systems. Approaches first reconstruct wavefunctions observed in EEG and fMRI scans. Second, single-neuron models (Hodgkin-Huxley, integrate-and-fire, theta neurons) and collective neuron models (neural field theories, Kuramoto oscillators) are employed to model empirical data. Third, genome physics is used to study time series sequence prediction in DNA, RNA, and proteins based on 3d+ complex geometry involving fields, curvature, knotting, and information compaction. Finally, quantum neuroscience physics is applied in AdS/Brain modeling, Chern-Simons biology (topological invariance), neuronal gauge theories, network neuroscience, and the chaotic dynamics of bifurcation and bistability (to explain epileptic and resting states). The potential benefit of this work is an improved understanding of disease and pathology resolution in humans.
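Of the collective neuron models named above, the Kuramoto oscillator model can be simulated in a few lines to show the synchronization behavior it is used for. The population size, coupling strength, and integration step below are illustrative choices, not values from the talk.

```python
# Kuramoto model in its standard form:
#   d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
import numpy as np

rng = np.random.default_rng(0)
N, K, dt, steps = 50, 2.0, 0.01, 2000
omega = rng.normal(0.0, 1.0, N)        # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)   # initial phases

for _ in range(steps):
    # pairwise phase differences theta_j - theta_i, Euler integration step
    diff = theta[None, :] - theta[:, None]
    theta = theta + dt * (omega + (K / N) * np.sin(diff).sum(axis=1))

# Order parameter r in [0, 1]: r near 1 means the population synchronizes.
r = abs(np.exp(1j * theta).mean())
print("synchronization r =", round(r, 3))
```

Sweeping K past the critical coupling and watching r jump is the standard way such models are fit against collective signals like EEG phase data.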
Quantum information science enables a new tier of scientific problem-solving as exemplified in early-adopter fields, foundational tools in quantum cryptography, quantum machine learning, and quantum chemistry (molecular quantum mechanics), and advanced applications in quantum space science, quantum finance, and quantum biology
Grammatology and Performativity: A Critical Theory of Silence: Silence is a crucial device for subversion, opposition, and socio-political commentary, the theoretical underpinnings of which are just starting to be understood. This work illuminates another position in the growing field of critical silence studies, theorizing silence as an asset whose ontological value has been lost in a world of literal and figurative noise. Part 1 philosophizes silence as a continuation of Derrida’s grammatology project. Such a grammatology of silence valorizes silent thinking over noisy speaking, and identifies the deconstructive binary pairing not as silence-speaking, but rather as silence-noise. Noise has a simultaneous physical-virtual existence as Shannon entropy calculates signal-to-noise ratios in modern communications networks. Part 2 employs the philosophy of noise to assess what is conceptually necessary to overcome noise in a critical theory of silence. Malaspina draws from Simondon to argue that noise is a form of individuation, essentially a living thing with unstoppable growth potential, not defined by a binary on-off switch but as a matter of gradation. Hence different theory resources are required to oppose it. Part 3 then develops a critical theory of silence to oppose noise in both its physical and virtual instantiations, with the two arms of a deeply human positive performativity (Szendy, Bennett) and a beyond-computational posthumanism (Puar). The result is a novel critical theory of silence as positive performativity that destabilizes noise and recoups the ontological status of silence as not merely an empty post-modern reification but a meaningful actuality.
Philosophy-aided Physics at the Boundary of Quantum-Classical Reality The philosophical themes of truth-knowledge and appearance-reality are used to interrogate the contemporary situation of the quantum-classical boundary, and more broadly the quantum-classical-relativistic stratification of physical scale boundaries. The contemporary moment finds us at breakneck pace in the industrial information revolution, digitizing remaining matter-based industries into a seamless exchange between physical-digital reality. Digitized news is giving way to digitized money and perhaps in the farther future, digitized mindfiles (such as personalized connectome files for precision medicine, autologous (own-DNA) stem cell therapies, and CRISPR for Alzheimer’s disease prevention). Our technologies are allowing us control over vast new domains, the relativistic with GPS and space-faring, and the quantum with quantum computing, harnessing the properties of superposition, entanglement, and interference. Philosophy provides critical thinking tools that can help us understand and master these rapid shifts in science and technology to avoid an Adornian instrumental reality (subsuming humanity under societal structures) and to maintain a Heideggerian backgrounded and enabling relation with technology (versus technology enframing us into mindless standing reserve).
The philosophical theme underlying the investigation of the scales of planets, persons, and particles is the relationship between truth and knowledge (or appearance and reality). The truth-knowledge problem is whether knowledge of the truth, true knowledge, the reality under the appearance, is even possible. Three salient moments in the history of the truth-knowledge problem are examined here. These are the German idealism of Kant and Hegel, the deconstructive postmodernism of Foucault and Derrida, and the unclear leanings of the current moment. The German idealism lens incorporates the self-knowing subject as agent into the truth and knowledge problem. The postmodernist view breaks with the subject and emphasizes the hidden opposites in the formulations, the constant reinterpretation of meaning, and porous boundaries. The contemporary moment wonders whether truth-knowledge boundaries still hold, in a Benjaminian view of non-identity between truth and knowledge, and truth increasingly being seen as a Foucauldian biopolitical manufactured quantity. Contemporaneity has a bimodal distribution of the subject: the hyperself (the constantly digitally represented selfie self) and the alienated post-subject subject.
These moments in the truth and knowledge debate inflect into the scale considerations of relativity, classicality, and quantum mechanics. Whereas general relativity and quantum mechanics are domains of universality, totality, and multiplicity, everyday classical reality is squeezed in as a belt between the two multiplicities as the concretion of drawing a triangle or tossing a ball. Recasting truth and k
Comprehensive philosophical programs arise within a historical context (for Hegel and Derrida in the democracy-shaping moments of the French Revolution (1789) and the student-worker protests (1968) in which French politics serve as a global harbinger of contemporary themes). In the Derrida-Hegel relationship, there is more rapprochement concerning core notions of difference, history, and meaning-assignation than may have been realized. In particular, Hegel’s philosophy, despite being assumed to be a totalizing system, in fact indicates precisely some of the same kinds of revised metaphysics-of-presence formulations that Derrida exhorts, namely those that are flexible, expansive, and include non-identity and identity.
A crucial Derrida-Hegel interchange is that of différance and difference. Derrida develops the notion directly from Hegel (“Différance,” “The Pit and the Pyramid”), but only draws from the Encyclopedia, not Hegel’s masterwork, the Phenomenology of Spirit. For Derrida, the “A” in différance is inspired by the form of the pyramid in the capitalized letter and in Hegel’s comparing the sign “to the Egyptian Pyramid” (“Différance,” p. 3). Derrida invokes the symbolism of the pyramid, antiquity, and Egyptian hieroglyphics as an early semiotic system. However, when considering Hegel’s central definition of difference in the dialectical progression of thesis-antithesis-synthesis in the Phenomenology of Spirit (§§159-163), the articulations of différance and difference are remarkably aligned.
Parallel formulations are also seen in history as a series of reinterpretable events, and indexical wrappers as a mechanism for meaning assignation. The thinkers examine the universal and the particular by exploring regulative mechanisms such as law (natural and social). In Glas, Derrida highlights not the singular-universal relation, but the law of singularity and the law of universality relation as being relevant to Hegel’s Antigone interpretation (Glas, p. 142a), a theme continued in “Before the Law.” Finally (time permitting), there is a question whether the most valid critiques of Hegel (Nietzsche’s unreason and Benjamin’s non-synthesis), as alternatives to Hegelian dialectics, are visible in Derrida’s thought.
The upshot is that the two thinkers produce similar formulations, derived from different trajectories of philosophical work; a situation which points to the potential universality of fundamental solution classes to open-ended philosophical problems, including the future of democracy.
Quantum Moreness: Kantian Time and the Performative Economics of Multiplicity
There is no domain with greater moreness than that of the quantum. A philosophy-aided physics approach (postmodernism and Continental philosophy) examines the contemporary situation of quantum moreness (more time and space dimensions than are available classically). Quantum moreness is configured by quantum reality being probabilistic; a multiplicity of outcomes all co-existing in superposition until collapsed in measurement. The quantum mindset uses quantum moreness to solve problems by thinking in terms of the greater scalability afforded in time and space with the quantum properties of superposition, entanglement, and interference. Quantum studies fields proliferate in arts and sciences, raising the Levi-Straussian raw-cooked dilemma of how “traditional humanities” are to be named alongside “digital humanities” and “quantum humanities.” Kant facilitates the conceptualization of quantum moreness by insisting on the dual nature of time as transcendentally ideal and empirically real. Kant’s moreness is allness, the absolute totality and multiplicity of time at the ideal level. Each faculty (sensibility, understanding, reason) has its own species of the a priori synthetic unity of ideal time that precedes and conditions the operation of the faculty. Each faculty also has a concretized formulation of empirically-real time as the time series, which is the basis for the faculties to interoperate to perform the conception of any empirical object. Kant’s achievement of time interoperability has potential extensibility to other areas of temporal incompatibility such as the scales of general relativity, Newtonian mechanics (human-scale), and quantum mechanics. The quantum moreness mindset with which Kant connects the ideal-real is visible in the domain of economics, itself too an ideal-real construction. 
The quantum moreness of money configures the postmodern abstraction of global cryptocurrencies and smart contract pledges, the implicative hope of which is a post-debt capital world that restores the human esprit in the face of an increasingly intense technologized reality.
Blockchain Crypto Jamming: Subverting the Instrumental Economy
The ultimate subversion is money, refusing the pecuniary resources of the state. This project applies a philosophical and critical theory lens to examine the use of nomenclature in one of the most radical longitudinal transformations in contemporary times, the shift away from state-run monetary resources towards cryptocurrencies and smart contracts in citizen-determined decentralized financial networks.
A Cryptoeconomic Theory of Social Change is presented in which linguistic progression serves as a tracking mechanism. The steps to lasting change have their own vocabulary (Brandom). First, there is the social critique, the complaint about what is wrong, the negative side (Adorno and Horkheimer highlight instrumental reason and the empty culture industry). Second, there is the antidote, an alternative that can overcome the complaint, the positive side. Third, the solution becomes the new reality, and as a consequence, the whole of reality is now seen in this context, adopting its vocabulary (“fiat health” system for example, referring to the antiquated method). The social movement graduates from language game (Wittgenstein) to form of life (Jaeggi).
Blockchains are Occupy with teeth, notable in the level of personal responsibility-taking by individuals to steward their own financial resources. The crypto citizen is not merely trading CryptoKitties and Bored Ape Yacht Club tokens, but getting blocktime loans through DeFi liquidity pools instead of fiat banks, earning labor income in crypto, and shifting all economic activity to blockchain networks. The artworld signals mainstream acceptance with Christie's auctioning a non-fungible token digital artwork by Beeple for $69 million. At the global level, coin communities constitute a new form of Kardashev-level (planetary-scale) democracy. Blockchains emerge as a robust smart network automation technology for super-class projects ranging from space-faring to quantum computing and thought-tokening. The further stakes of this work are having a language-based theory of social change with broad applicability to social transformation.
This work argues that the emerging understanding of time in quantum information science can be articulated as a philosophical theory of change. Change and time are interrelated, and one can be used to interrogate the other; namely, a theory of change can be derived from a theory of time. What is new in quantum science is time being regarded as just another property to be engineered. At the quantum scale, time is reversible in certain ways, which is quite different from the everyday experience of time, whose unidirectional arrow does not allow a dropped egg to reassemble. At the quantum scale of atoms, though, a particle retains the history of its trajectory, which may be retraced before collapse in measurement.
Quantum scientists evolve systems backward and forward in time, controlling phase transitions with Floquet engineering. Quantum systems are entangled in time and space, with temporal correlations exhibiting greater multiplicity than spatial correlations. The chaotic time regimes of ballistic spread followed by saturation are implemented in quantum walks for faster search and heightened cryptosecurity. In quantum neuroscience, seizure may be explained by chaotic dynamics and normal resting state by Floquet-like periodic cycles. Time is revealed to have the same kinds of repeating structures as space (described by entanglement, symmetry, and topology), differently instantiated and controlled.
The quantum understanding of time can be propelled into a macroscale theory of change through its connotation of a more flexible, malleable, probabilistic interface with reality. Change becomes less rigid. Probability is the lever of change, but it is notoriously difficult for humans to grasp, as we think better in storylines than statistics. The idea of manipulating quantum-system properties, in which time, space, and dynamics (change) are all just parameters, is an empowering frame for the acceptance of change. The quantum mindset affords greater facility with probability-driven events (change).
Blockchains in Space: Non-Euclidean Spacetime and Tokenized Thinking
Two requirements for the large-scale beyond-terrestrial expansion of human intelligence into the universe are the ability to operate in diverse spatiotemporal regimes and to instantiate thinking in various formats. Newtonian mechanics describes everyday reality, but Einsteinian physics is needed for GPS and the orbital technologies of telescopes and spacecraft. Space agencies already integrate the Earth day and the slightly longer Martian sol. A more substantial move into space requires facility with non-Euclidean spacetimes. One challenge is that general relativity and quantum mechanics are non-interoperable. However, the theories can be formulated together when considering black holes and quantum computing, since geometric theories and gauge theories are both field-based. Quantum blockchains instantiate blockchain logic in quantum computational environments. Blockchains have their own temporal regime (blocktime: the number of blocks for an event to occur), and hence quantum blocktime is a non-classical functionality for operating in diverse spatiotemporal regimes. Thinking is a rule-based activity that is unrestricted by medium. Central to thinking are concepts, which are referenced by words. Word types include universals, particulars, and indexicals, which can be encoded into a formal system as thought-tokens and registered to blockchains. Blockchains are contemplated as an automation technology for asteroid mining and space-settlement construction, and thought-tokening adds an intelligence layer. Time and tokenized thinking come together in the idea of smart networks in space. In blockchain quantum smart networks, spatiotemporal regimes and thought-tokens are simply different value types (asset classes) coordinated with blockchain logic, towards the aim of extending human capabilities into the farther reaches of space.
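The idea of blocktime as its own temporal regime can be sketched as a simple unit conversion, with duration measured in blocks rather than seconds. A minimal illustration (the chain names and block-interval figures below are approximate, commonly cited values, not from the text):

```python
# Illustrative sketch of "blocktime": duration is measured in blocks,
# so the same block count spans different wall-clock times on chains
# with different block intervals. Interval figures are approximate,
# commonly cited values (assumptions, not from the text).

BLOCK_INTERVAL_SECONDS = {
    "bitcoin": 600,   # ~10 minutes per block
    "ethereum": 12,   # ~12 seconds per slot
}

def blocktime_to_seconds(blocks: int, chain: str) -> int:
    """Convert a duration expressed in blocks to approximate seconds."""
    return blocks * BLOCK_INTERVAL_SECONDS[chain]

# The same 30-block term is hours on one chain, minutes on another.
print(blocktime_to_seconds(30, "bitcoin"))   # 18000 seconds (~5 hours)
print(blocktime_to_seconds(30, "ethereum"))  # 360 seconds (6 minutes)
```

The point of the sketch is that a blocktime regime is defined relative to its network, so "30 blocks" is a well-defined duration only once a chain (or spatiotemporal regime) is specified.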
Cryptography, entanglement, and quantum blocktime: Quantum computing offers a more scalable energy-efficient platform than classical computing and supercomputing, and corresponds more naturally to the three-dimensional structure of atomic reality. Blockchains are a decentralized digital economic system made possible by the 24-7 global nature of the internet.
Quantum Neuroscience: CRISPR for Alzheimer's, Connectomes & Quantum BCIs – Melanie Swan
This talk provides an introduction to quantum computing and how it may be deployed to study the human brain and its diseases of pathology and aging. Refined to its present state over centuries, the brain is one of the most complex systems known, with 86 billion neurons and 242 trillion synapses connected in intricate patterns and rewired by synaptic plasticity. Research continues to illuminate the mysteries of the brain. Quantum computing provides a more capacious architecture with greater scalability and energy efficiency than current methods of classical computing and supercomputing, and more naturally corresponds to the three-dimensional structure of atomic reality. The vision for quantum neuroscience is to model the nature of the brain exactly as it is, in three-dimensional atomically-accurate representations. Neuroscience (particularly genetic disease modeling, connectomics, and synaptomics) could be the “killer application” of quantum computing. Implementations in other industries are also important, including in quantum finance, quantum cryptography using Shor’s factoring algorithm (“the Y2K of Crypto”), Grover’s search, quantum chemistry, eigensolvers, quantum machine learning, and continuous-time quantum walks. Quantum computing is a high-profile worldwide scientific endeavor with platforms currently available via cloud services (IBM Q 27-qubit, IonQ 32-qubit, Rigetti 19Q Acorn) and is in the process of being applied in various industries including computational neuroscience.
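Grover's search, mentioned above, can be illustrated with a small classical simulation of its amplitude dynamics (a toy sketch under assumptions of mine: an 8-element search space with a single marked item; real implementations run on quantum hardware or simulators such as IBM Q):

```python
import math

# Classical toy simulation of Grover-search amplitude dynamics.
# Each iteration flips the marked item's amplitude (oracle) and then
# inverts all amplitudes about their mean (diffusion), concentrating
# measurement probability on the marked item in ~(pi/4)*sqrt(N) steps.
N = 8
marked = 5                                           # assumed marked index
amps = [1 / math.sqrt(N)] * N                        # uniform superposition
iterations = int(round(math.pi / 4 * math.sqrt(N)))  # optimal iteration count
for _ in range(iterations):
    amps[marked] = -amps[marked]                     # oracle phase flip
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]              # inversion about the mean
print(round(amps[marked] ** 2, 3))  # ~0.945 after 2 iterations
```

A classical scan of 8 unsorted items needs 8 lookups in the worst case; here two Grover iterations already yield a ~95% chance of measuring the marked item, which is the quadratic speedup the abstract alludes to.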
Art Theory: Two Cultures Synthesis of Art and Science – Melanie Swan
Thesis: Aesthetic resources contribute broadly to the human endeavor of progress, self-understanding, and science, beyond the immediate experience of art. Aesthetic Resources are frameworks, concepts, and modes of expression in art, literature, and philosophy that capture the imagination and the intellect through the senses. The role of art is to inspire the future: the romance of the sea, the open road, space.
The arts are a hallmark of civilization, but can their benefit be crystallized as aesthetic resources that can be mobilized to new situations? How can aesthetic resources help in moments of crisis?
A worldwide social identity crisis has been provoked by pandemic recovery, politics, equity, and environmental sustainability. Philosophical and aesthetic resources can help. Understanding art as a reflection of who we are as individuals and groups, this talk explores conceptualizations of art, with examples, in different periodizations from the 1800s to the present. A marquee definition of what constitutes an artwork is Adorno's, for whom the work must promulgate its own natural law and engage in novel materials manipulation. For many theorists, art is the pressing of our self-concept into concrete materiality (whether pyramids, sculpture, or painting). What do contemporary periodizations of art mean for our current and forward-looking self-concept? Recent eras include the neo-avant-gardes of 1945, the conceptual art of the 1960s, and post-conceptual art starting in the 1970s, produced generatively with found materials, the digital domain, and audience interactivity. What is the now-current idea of art? Is today's Baudelairian flâneur and Balzacian modern hero incarnated in the quantum aesthetic imaginary and the digital cryptocitizen? Far from the "end of art" thesis sometimes attributed to Hegel, aesthetic practices are more relevant than ever. Individually and societally, we are reinventing creative energy and productive imagination in venues from science, technology, health, and biology to the arts.
Nexus of Biology and Computing
1. The nexus of biology and computing
Small scale and complexity are forcing advances in computational methodologies
Melanie Swan, Futurist
MS Futures Group
melanie@melanieswan.com
http://www.melanieswan.com
http://futurememes.blogspot.com
BCIG NIH
May 24, 2007
2. Bio – Melanie Swan
Educational background:
- BA French & Economics, Georgetown University
- MBA Finance & Accounting, Wharton, University of Pennsylvania
- Current coursework in Physics & Computer Science
Professional experience:
- Futurist: speaker, researcher, business advisor
- Hedge fund manager: Wall Street, proprietary
Current projects:
- OpenBasicResearch.org
- del.icio.us for people
- Issues in running historical simulations
Interests: science fiction, travel
3. Summary: Seven principles suggest future advances in computational methodologies
1. Approaches to computation – parallelism
2. Architecture – modularity, simplicity and ubiquity of structure
3. Goals – broadly defined objectives to drive higher-value results
4. Modulation mechanisms – information modulation
5. Prediction mechanisms – probabilistic models
6. Unconscious processing – unobtrusive computing
7. Multidisciplinarity – adjacent-discipline integration
4. Principle 1: Approaches to computation
Traditional: Von Neumann
- Linear
Current and future: non-Von Neumann
- Cellular, tissue, systemic, holistic focus
- Parallelism and multicore in hardware and software
- DNA computing
- Quantum computing
- Genetic computing
- Evo-devo: blend of bottom-up emergence and top-down design
Suggests biological and other approaches facilitating parallelism are required for molecular-scale computing
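The population-based, parallel style of genetic computing named on this slide can be sketched with a minimal genetic algorithm. This is an illustrative toy of mine, not material from the talk: a population of bitstrings evolves toward all ones via selection, crossover, and mutation.

```python
import random

# Minimal genetic-algorithm toy: many candidate solutions are improved
# in parallel each generation, rather than one solution being computed
# by a single linear sequence of instructions.
random.seed(0)
LENGTH, POP, GENERATIONS = 20, 30, 60

def fitness(genome):
    return sum(genome)  # count of 1 bits; maximum is LENGTH

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, LENGTH)  # single-point crossover
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(LENGTH)]
              for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]   # elitist selection: keep the top half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))  # typically converges to 20 (all ones)
```

No single agent computes the answer; improvement emerges from the population, which is the bottom-up, evo-devo flavor the slide contrasts with linear Von Neumann execution.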
5. Principle 2: Architecture
Conservation
- Across simple and complex organisms
- Across processes within one organism
- Across time, evolution
Structure
- Same loose administrative over-structures, diverse applications
- Redundancy in architecture and process
- Massively distributed individual agents
Suggests modularity, simplicity, and ubiquity of underlying structure
6. Principle 3: Goals
Traditional, singular:
- One precise goal or outcome
- Tightly directed process coupled to outcome
- Task paradigm
- Exclusive focus on THE solution
Systemic, holistic:
- Clusters of functionality, capability, redundancy
- Loose process, many outcomes
- Service paradigm
- Focus on obtaining useful information
Suggests more broadly defined objectives drive higher-value results
7. Principle 4: Modulation mechanisms
Short- and long-term memory: an implemented evaluation of the importance of information
- The brain automatically modulates importance
Computing can better modulate information with attributes signaling relevance, value, accuracy, etc.
- Repetition, time-based algorithms
- Web 2.0 marks relevance and importance
- Scientific Research 2.0 – digg for PubMed, RSS peer feeds, collaborative research-paper commenting and annotation
Suggests much higher levels of information modulation with relevance attributes
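The "repetition, time-based algorithms" idea can be sketched as a time-decayed relevance score: each access to an item adds weight, and weights fade with age, mimicking how the brain lets unreinforced information decay. The function name and the one-day half-life are assumptions of mine for illustration.

```python
# Time-decayed relevance score: repeated, recent accesses keep an item
# important; unrepeated, old accesses let it fade (the half-life value
# is an assumed parameter, not from the slides).

def relevance(access_times, now, half_life=86400.0):
    """Each access contributes a weight that halves every half_life seconds."""
    return sum(0.5 ** ((now - t) / half_life) for t in access_times)

now = 10 * 86400.0                            # an arbitrary "current" timestamp
recent = [now - 3600, now - 7200]             # two accesses in the last 2 hours
stale = [now - 9 * 86400, now - 8 * 86400]    # two accesses over a week ago
print(relevance(recent, now) > relevance(stale, now))  # True
```

Both items were accessed twice, yet the recently repeated one scores far higher, which is exactly the relevance attribute the slide proposes computing systems should carry.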
8. Principle 5: Prediction mechanisms
Prediction is a strong biological mechanism
Explosion in predictive, probabilistic, statistical, Bayesian papers and applications
- Numenta
- Google
Key parameters of successful probabilistic-model implementation:
- Large data corpus
- Abstraction processes
Suggests greater development and application of probabilistic models
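A minimal probabilistic model in the spirit of this slide is Bayes' rule applied to a diagnostic test; the function name and all numbers below are made up for illustration.

```python
# Bayes' rule sketch: the posterior probability of a condition given a
# positive test, combining a prior with the test's error rates
# (all figures are illustrative assumptions).

def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# Rare condition (1% prior), sensitive test (95%), 5% false positives:
p = posterior(0.01, 0.95, 0.05)
print(round(p, 3))  # 0.161 -- most positives are false when the prior is small
```

The counterintuitive result (a 95%-sensitive test yielding only a ~16% posterior) is why the slide stresses that successful probabilistic models need a large data corpus: priors and error rates must be estimated well for the inference to be trustworthy.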
9. Principle 6: Unconscious processing
The brain processes mainly unconsciously
Some computer processing is "unconscious"
- AI, virus scans, Ajax websites
Other computer processing is very obvious
- Memory, processing, storage
- Heat, power, battery
- Connectivity
Processing will become less conscious
- Wearables, pen computing, visualization, simulation
- Ubiquitous embedded chips, sensors, connectivity
Suggests a focus on less obtrusive computing
10. Principle 7: Multidisciplinarity
Cross-field collaboration and new area definition
- Molecular cognition, molecular science of behavior
Systems biology
- Quantitative measurement and mathematical analysis
- Systems-level studies: focus on quantitative aspects and interactions among elements
- Need to standardize: an eigenvalue by any other name
Multidisciplinary cataloging of all biological information
- E.O. Wilson's Encyclopedia of Life
Suggests greater integration of adjacent disciplines in pursuit of open research questions
11. Summary: Seven principles suggest future advances in computational methodologies
1. Approaches to computation – parallelism
2. Architecture – modularity, simplicity and ubiquity of structure
3. Goals – broadly defined objectives to drive higher-value results
4. Modulation mechanisms – information modulation
5. Prediction mechanisms – probabilistic models
6. Unconscious processing – unobtrusive computing
7. Multidisciplinarity – adjacent-discipline integration
12. Thank you
Melanie Swan, Futurist
MS Futures Group
melanie@melanieswan.com
http://www.melanieswan.com
http://futurememes.blogspot.com