
# Fuzzy math, logic, algorithms ...



Possibility theory is a mathematical theory for dealing with certain types of uncertainty and is an alternative to probability theory. When first introduced, possibility theory was an extension of the theory of fuzzy sets and fuzzy logic; others later added a proposed min/max algebra to describe degrees of potential surprise.

Formalization of possibility

For simplicity, assume that the universe of discourse Ω is a finite set, and assume that all subsets are measurable. A possibility measure is a function pos from the subsets of Ω to [0, 1] such that:

Axiom 1: pos(∅) = 0
Axiom 2: pos(Ω) = 1
Axiom 3: pos(U ∪ V) = max(pos(U), pos(V)) for any disjoint subsets U and V

It follows that, like a probability measure, a possibility measure on a finite set is determined by its behavior on singletons:

pos(U) = max{ pos({ω}) : ω ∈ U }

Axiom 1 can be interpreted as the assumption that Ω is an exhaustive description of future states of the world, because it means that no belief weight is given to elements outside Ω.
Axiom 2 can be interpreted as the assumption that the evidence from which possibilities are constructed is free of any contradiction. Technically, it implies that there is at least one element of Ω with possibility 1.
Axiom 3 corresponds to the additivity axiom of probability theory. However, there is an important practical difference: possibility theory is computationally more convenient because Axioms 1–3 imply that

pos(U ∪ V) = max(pos(U), pos(V)) for any subsets U and V.

Because one can know the possibility of a union from the possibility of each component, possibility is compositional with respect to the union operator. Note, however, that it is not compositional with respect to the intersection operator. In general, only

pos(U ∩ V) ≤ min(pos(U), pos(V)).

When Ω is not finite, Axiom 3 can be replaced by: for all index sets I, if the subsets U_i (i ∈ I) are pairwise disjoint, then pos(∪_i U_i) = sup_i pos(U_i).
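The axioms can be checked directly on a small finite universe. A minimal Python sketch, using an illustrative three-outcome Ω and made-up singleton possibilities (the names and values are assumptions, not from this document):

```python
# A minimal sketch of a possibility measure on a finite universe.
# The universe and the singleton possibilities are illustrative.

def pos(event, dist):
    """Possibility of an event (a set of outcomes): max over its singletons."""
    return max((dist[w] for w in event), default=0.0)  # empty event -> 0

# Possibility distribution on singletons; Axiom 2 requires at least
# one outcome with possibility 1.
dist = {"a": 1.0, "b": 0.7, "c": 0.2}
omega = set(dist)

assert pos(set(), dist) == 0.0   # Axiom 1: pos(empty set) = 0
assert pos(omega, dist) == 1.0   # Axiom 2: pos(Omega) = 1

U, V = {"a"}, {"b", "c"}
# Compositional with respect to union: pos(U | V) = max(pos(U), pos(V))
assert pos(U | V, dist) == max(pos(U, dist), pos(V, dist))
# Not compositional for intersection: only an inequality holds in general
assert pos(U & V, dist) <= min(pos(U, dist), pos(V, dist))
```

The union of two events is computed from its parts; no such identity exists for the intersection, which is the practical content of the compositionality remark above.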

Necessity

Whereas probability theory uses a single number, the probability, to describe how likely an event is to occur, possibility theory uses two concepts: the possibility and the necessity of the event. For any event U, the necessity measure is defined from the possibility of its complement:

nec(U) = 1 − pos(Ω \ U)

Note that, contrary to probability theory, possibility is not self-dual. For any event U, we only have the inequality

nec(U) ≤ pos(U).

However, the following duality rule holds: for any event U, either pos(U) = 1 or nec(U) = 0.
Accordingly, beliefs about an event can be represented by a number and a bit.

Interpretations:
- nec(U) = 1 means that U is necessary: U is certainly true. It implies that pos(U) = 1.
- pos(U) = 0 means that U is impossible: U is certainly false. It implies that nec(U) = 0.
- pos(U) = 1 means that U is possible: nec(U) may be anywhere between 0 and 1.
- nec(U) = 0 means that U is unnecessary: pos(U) may be anywhere between 0 and 1.
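The necessity measure and both rules above can be sketched the same way; the distribution is again an illustrative assumption:

```python
# A sketch of the necessity measure nec(U) = 1 - pos(complement of U),
# built on an illustrative possibility distribution, to check the
# inequality nec(U) <= pos(U) and the duality rule.

def pos(event, dist):
    return max((dist[w] for w in event), default=0.0)

def nec(event, dist):
    omega = set(dist)
    return 1.0 - pos(omega - set(event), dist)

dist = {"a": 1.0, "b": 0.7, "c": 0.2}

for U in ({"a"}, {"b"}, {"a", "b"}, set()):
    assert nec(U, dist) <= pos(U, dist)                # not self-dual: inequality only
    assert pos(U, dist) == 1.0 or nec(U, dist) == 0.0  # duality rule
```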

Published in: Science, Technology, Education

### Fuzzy math, logic, algorithms ...

1. Creating Bioscience Algorithms by combining knowns with fluid knowledge, concepts, and entities: abstracts, theoreticals, models, probables, possibles, maybes, dynamics, impossibles, unpredictables, unknowns, for a biologic concept or bioscience area of interest. In this example, the areas of interest are Stem Cell Biology, Tissue Engineering, Scaffold Biology, 3-D Biology, and 4-D Biology, ALL compared vs. the time points Time 0 = 2000, Time 1 = 2011, Time 2 = 2014; vs. the time frames T0–T1, T1–T2, T0–T2; and vs. predictive modeling for 2020 trends.
2. P2–P3: a useful but detailed explanation, not needed if one grasps the bigger picture: the combination of the "suspension of disbelief" with logic, theory, fact, and possibility vs. probability, with all else unknown or unpredictable thrown in. The trends are the place to start. P4–P7: an attempt to simplify without explaining; here, the trends are more important than how they were derived. P8–P11: data from the above, graphics, comments, discussions, and predictive modeling. P12–end: a series of selected figures and data, relevant to this topic, but for review in brief, nothing more (the last slide is a slide from 2008).
3. Possibility theory is a mathematical theory for dealing with certain types of uncertainty and is an alternative to probability theory. When first introduced, possibility theory was an extension of the theory of fuzzy sets and fuzzy logic; others later added a proposed min/max algebra to describe degrees of potential surprise.
   Formalization of possibility. For simplicity, assume that the universe of discourse Ω is a finite set, and assume that all subsets are measurable. A possibility measure is a function pos from the subsets of Ω to [0, 1] such that:
   Axiom 1: pos(∅) = 0
   Axiom 2: pos(Ω) = 1
   Axiom 3: pos(U ∪ V) = max(pos(U), pos(V)) for any disjoint subsets U and V
   It follows that, like a probability measure, a possibility measure on a finite set is determined by its behavior on singletons: pos(U) = max{ pos({ω}) : ω ∈ U }.
   Axiom 1 can be interpreted as the assumption that Ω is an exhaustive description of future states of the world, because it means that no belief weight is given to elements outside Ω. Axiom 2 can be interpreted as the assumption that the evidence from which possibilities are constructed is free of any contradiction; technically, it implies that there is at least one element of Ω with possibility 1. Axiom 3 corresponds to the additivity axiom of probability theory. However, there is an important practical difference: possibility theory is computationally more convenient because Axioms 1–3 imply that pos(U ∪ V) = max(pos(U), pos(V)) for any subsets U and V.
   Because one can know the possibility of a union from the possibility of each component, possibility is compositional with respect to the union operator. Note, however, that it is not compositional with respect to the intersection operator; in general, only pos(U ∩ V) ≤ min(pos(U), pos(V)). When Ω is not finite, Axiom 3 can be replaced by: for all index sets I, if the subsets U_i (i ∈ I) are pairwise disjoint, then pos(∪_i U_i) = sup_i pos(U_i).
   Necessity. Whereas probability theory uses a single number, the probability, to describe how likely an event is to occur, possibility theory uses two concepts: the possibility and the necessity of the event.
   For any event U, the necessity measure is defined from the possibility of its complement: nec(U) = 1 − pos(Ω \ U). Note that, contrary to probability theory, possibility is not self-dual. For any event U, we only have the inequality nec(U) ≤ pos(U). However, the following duality rule holds: for any event U, either pos(U) = 1 or nec(U) = 0.
4. Accordingly, beliefs about an event can be represented by a number and a bit. Interpretations:
   - nec(U) = 1 means that U is necessary: U is certainly true. It implies that pos(U) = 1.
   - pos(U) = 0 means that U is impossible: U is certainly false. It implies that nec(U) = 0.
   - pos(U) = 1 means that U is possible: nec(U) may be anywhere between 0 and 1.
   - nec(U) = 0 means that U is unnecessary: pos(U) may be anywhere between 0 and 1.
   Note that, unlike possibility, fuzzy logic is compositional with respect to both the union and the intersection operator. The relationship with fuzzy theory can be explained with the following classical example.
   Fuzzy logic: When a bottle is half full, it can be said that the level of truth of the proposition "The bottle is full" is 0.5. The word "full" is seen as a fuzzy predicate describing the amount of liquid in the bottle.
   Possibility theory: There is one bottle, either completely full or totally empty. The proposition "the possibility level that the bottle is full is 0.5" describes a degree of belief. One way to interpret 0.5 in that proposition is to define its meaning as: I am ready to bet that the bottle is empty as long as the odds are even (1:1) or better, and I would not bet at any rate that it is full.
   Possibility theory as an imprecise probability theory: There is an extensive formal correspondence between probability and possibility theories, in which the addition operator corresponds to the maximum operator. The operators of possibility theory can be seen as a hyper-cautious version of the operators of the transferable belief model, a modern development of the theory of evidence. Possibility can be seen as an upper probability: any possibility distribution defines a unique set of admissible probability distributions, namely those P with P(S) ≤ pos(S) for every event S. This allows one to study possibility theory using the tools of imprecise probabilities.
   Necessity logic: We call generalized possibility every function satisfying Axiom 1 and Axiom 3.
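Before continuing with necessity logic, the "upper probability" correspondence above can be sketched by brute force on a small universe: a probability distribution P is admissible for a possibility distribution when P(S) ≤ pos(S) for every event S. All names and values below are illustrative assumptions:

```python
# A sketch of "possibility as an upper probability": check admissibility
# of a probability distribution P against a possibility distribution by
# enumerating every event S and testing P(S) <= pos(S).
from itertools import chain, combinations

def pos(event, dist):
    return max((dist[w] for w in event), default=0.0)

def prob(event, p):
    return sum(p[w] for w in event)

def admissible(p, dist):
    outcomes = list(dist)
    events = chain.from_iterable(
        combinations(outcomes, r) for r in range(len(outcomes) + 1))
    return all(prob(e, p) <= pos(e, dist) for e in events)

dist = {"a": 1.0, "b": 0.7, "c": 0.2}

assert admissible({"a": 0.5, "b": 0.4, "c": 0.1}, dist)      # every P(S) <= pos(S)
assert not admissible({"a": 0.2, "b": 0.4, "c": 0.4}, dist)  # P({c}) = 0.4 > pos({c}) = 0.2
```

The enumeration is exponential in |Ω|, which is fine for a sketch; the point is only that a single possibility distribution bounds a whole family of probability distributions from above.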
   We call generalized necessity the dual of a generalized possibility. The generalized necessities are related to a very simple and interesting fuzzy logic called necessity logic. In the deduction apparatus of necessity logic, the logical axioms are the usual classical tautologies, and there is only one fuzzy inference rule, extending the usual modus ponens: if α and α → β are proved at degrees λ and μ respectively, then β can be asserted at degree min{λ, μ}. It is easy to see that the theories of such a logic are the generalized necessities, and that the completely consistent theories coincide with the necessities.
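The single inference rule of necessity logic is easy to sketch; the proof degrees below are illustrative:

```python
# A sketch of the graded modus ponens of necessity logic: if alpha is
# proved at degree lam and (alpha -> beta) at degree mu, then beta can
# be asserted at degree min(lam, mu).

def graded_modus_ponens(lam, mu):
    """Degree at which beta follows from alpha and alpha -> beta."""
    return min(lam, mu)

# alpha proved at 0.8, alpha -> beta proved at 0.6: beta at degree 0.6
assert graded_modus_ponens(0.8, 0.6) == 0.6
# chaining inferences can only lower (or keep) the degree of the conclusion
assert graded_modus_ponens(graded_modus_ponens(0.8, 0.6), 0.9) == 0.6
```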
5. An "exploratory" algorithm (as used here) in context: as a step-by-step procedure, this is by definition outside the box. There is a process and protocol to the paradigm; there is also much extrapolation and assimilation from existing bioinformation, and a lot of the same as fill-in-the-gap activity on the back end as well. Creativity and time are needed, for example:
   - comparing a vs. b vs. c (insert area or variable, scientific process, known biologics, etc.)
   - calculations, data processing, analysis, play in and out of the end point
   - collation, theoretical or automated reasoning, as a tool for formulation and modeling
   - real modeling requires integration of many variables and overlay systems
   - testing and adjusting, modifying, and improving in real time is important
   As used here, it is informal and grossly accurate at best; it allows information to be derived from information, but any such results, interpretations, and conclusions must be closely scrutinized and challenged. Among the considerations: confounders; variables known and unknown; the need to mix hard, real data with estimates of some data sets as indicated; and a list of qualifiers and caveats that grows exponentially.
6. "Exploratory" Algorithm Building Blocks (ABCs), for discussion. Six areas are introduced here:
   1. The biologic area of interest, the topic around which modeling occurs.
   2–5. The +/−/×/÷/"other" manipulations applied to assigned values: 2 & 3, common and/or derived variables (routine, simple, logical); 4 & 5, variable X factors and/or modifiers, etc.
   6. Adjustment up/down (tweaking) for random events, caveats, negatives or challenges, and confounders; this must include all examples thought to be reasonable, and to a reasonable degree, to get a more realistic, accurate model (a conservative approach).
7. Units (for discussion), routine/common vs. Value X units. Common:
   - Distance (base: metric): pico, nano, micro, milli, etc.
   - Weight (base: metric): pico, nano, micro, milli, gram, kilogram
   - Speed (base: metric, velocity per unit of time): kilometer, meter, milli, micron, nano
   - Energy, heat, kinetic motion …
   - Volume (base: metric): liter, mL, µL, nL (a simple calculation of dV/T)
   Example of a "unit" in context, for comparison, bits and bytes: the ISQ symbols for the bit and byte are bit and B. In the context of data-rate units, one byte consists of 8 bits and is synonymous with the unit octet. The abbreviation bps is often used to mean bit/s, so that when a 1 Mbps connection is advertised, it usually means that the maximum achievable bandwidth is 1 Mbit/s (one million bits per second), which is 0.125 MB/s (megabytes per second), or about 0.1192 MiB/s (mebibytes per second). The IEEE uses the symbol b for bit. Extracting meaning, down to the unit: "maximum achievable bandwidth is 1 Mbit/s (one million bits per second), which is 0.125 MB/s (megabyte per second), or about 0.1192 MiB/s (mebibyte per second)."
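The quoted conversion can be verified with a few lines of arithmetic (1 byte = 8 bits, decimal megabyte = 10^6 bytes, mebibyte = 2^20 bytes):

```python
# A sketch of the bit/byte data-rate conversion quoted above.

def mbit_s_to_mb_s(mbit_s):
    """Megabits per second -> megabytes (10**6 bytes) per second."""
    return mbit_s * 1e6 / 8 / 1e6

def mbit_s_to_mib_s(mbit_s):
    """Megabits per second -> mebibytes (2**20 bytes) per second."""
    return mbit_s * 1e6 / 8 / 2**20

assert mbit_s_to_mb_s(1) == 0.125              # 1 Mbit/s = 0.125 MB/s
assert round(mbit_s_to_mib_s(1), 4) == 0.1192  # 1 Mbit/s ~ 0.1192 MiB/s
```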
8. Units for Value X, a pseudo-tangible value, for modeling only. It is too complex, too fuzzy, and too abstract to define here; the unit is arbitrary:
   - comprehensive in quality: complex and diverse, but logical and common-sense more often than not
   - many variables; most values are dynamic
   - parameters, or derivations from parameters, useful as indicated
   - conceptual, abstract, adapted, as indicated
   - adapted measures, methodology, and units for variables as indicated, etc.
   - real, logical, and theoretical all at the same time ~ mixed and matched
   As used here, it could not be a real-world value, although that approach could be used; the sub-models could not come together without a Unit X created by combining all of them and adding new values, both pseudo and space-filling (i.e., dark matter).
9. Graphing results: Value X for 5 biologic areas vs. time (X = velocity, acceleration, GenX vs. biotrends). Areas: Stem Cell Biology, Tissue Engineering, Scaffold Biology, 3-D Biology, 4-D Biology. [Graph: Unit X (1 U = 200) vs. time in years: 2000, 2011, 2014, 2020. See next slide.]
10. [Graph: the same 5 biologic areas vs. time, Unit X (1 U = 200), years 2000, 2011, 2014, 2020; lines demarcate the years 2011 and 2014.] Values in 2011 exceed the limits of the linear graph (27,000+) for SCB and TE; numerical values are given for 2014 (~54,000 and ~60,000). Time: T0 = 2000, T1 = 2011 (4th quarter), T2 = 2014; time frames: T0–T1 vs. T1–T2 vs. T0–T2. When used as described, this graphic gives 8 things: 1 & 2, Value X (~velocity) for a time point or frame; 3 & 4, acceleration at time A vs. B, and frame A vs. B; 5, 2′ and 3′ (secondary and tertiary) data within the data; 6, future trends, hypotheses, models; 7, modeling of generations vs. time; 8, many other things.
11. Value X = velocity; Δvelocity ~ acceleration; GenX = complexity doubling time.
    Time points: T0 = 2000 (or adjusted historically), T1 = 2011, T2 = 2014.
    Time frames: T0–T1 = 2000–2011, T0–T2 = 2000–2014, T1–T2 = 2011–2014.
    ΔValue X or Δvelocity = acceleration, from T0 to T1, vs. T1 to T2, vs. T0 to T2.
    T1–T2 is the time of rapid growth, from which present-day and near-future acceleration is modeled (growth in 2000–2011 vs. ~2012 to present).
    GenX = a doubling of complexity, the evolution of the biologic area.
    Δ = delta = change; Y = year; Acc = acceleration.
12. Results:

    | Bioscience | Δvelocity ~ Value X (T0–T2) | GenX (T0–T2 / Y) | ΔAcc (T1–T2, avg) |
    |---|---|---|---|
    | SCB | 50X | 7 | 8X/Y |
    | TE | 50X | 7 | 7.7X/Y |
    | SB | 20X | 4 | 2.3X/Y |
    | 3D | 18X | 3 | 1.8X/Y |
    | 4D | >2X | 1.7 | 1.13X/Y |

    T = time; Δ = delta = change. Value X = velocity at a time point, for a time frame; GenX = doubling of technology; Acc = acceleration at a time point, for a time frame. SCB = Stem Cell Biology, TE = Tissue Engineering, SB = Scaffold Biology, 3D = 3-D Biologics, 4D = 4-D Biologics.
13. Thoughts, observations, discussion …
    Stem Cell Biology: did not really become a focus area of research until 2000, with huge acceleration in 2005; leads the way among all biologic areas discussed (if isolated from each other) in 2014.
    Tissue Engineering: did not really become an area of focus until 2008, and did not really become a huge area of research until ~2010; currently 2nd overall in X value among the biologic areas discussed (if isolated from each other). Trends: this area will lead all areas discussed here by 2020 (if isolated from each other).
    Scaffold Biology: did not really become an area of focused research until 2008, and did not really become a huge area of research until ~2010; currently 3rd overall in X value among the biologic areas discussed (if isolated from each other).
14. 3D Biology: did not really become an area of focused research until 2009, and did not really become a huge area of research until ~2011; currently 4th overall in X value among the biologic areas discussed (if isolated from each other). Trends suggest it will grow exponentially in all areas by 2020.
    4D Biology: has yet to become an area of focused research; currently 5th overall in X value among the biologic areas discussed (if isolated from each other). Trends suggest this area will grow exponentially in all areas by 2020.
15. Thoughts, observations, discussion …
    Stem Cell Biology has undergone 7 generations of complexity over 2000–2014, 3 of them in T1–T2.
    Tissue Engineering has undergone 7 generations of complexity over 2000–2014, 3 of them in T1–T2.
    Scaffold Biology has undergone 4 generations of complexity over 2008–2014, 2 of them in T1–T2.
    3D Biology has undergone 3 generations of complexity over 2009–2014, 2 of them in T1–T2.
    4D Biology has yet to formally define itself as a Value X over 2010–2014, but is trending; it has undergone <2 generations of complexity over 2009–2014, and is accelerating.
16. Thoughts, observations, discussions …
    All five areas are accelerating over time.
    Four areas are moving (trending, directionally) towards each other.
    All areas demonstrate huge gains in the last 3 years vs. the previous 11 years.
    The total data sets, information, biotechnology, etc. are extensive, beyond this commentary, both for the comments and for the algorithms.
    Most areas demonstrate linear vs. nonlinear acceleration early (2000) vs. late (2011) in R&D.
17. Thoughts, observations, discussions … predictions? An example of trends seen in the data, a look at 2002:
    Stem Cell Biology will continue to evolve.
    Tissue Engineering will continue to evolve.
    Scaffold, 3D, and 4D Biology will continue to evolve, and likely will merge into a comprehensive model that will later merge with Tissue Engineering into Definitive Tissue Engineering Platform Technologies.
    [Diagram: SCB → SCB′ → Regenerative Medicine Final Common Biotechnologies; TE → TE′, joined by SB, 3D, and 4D (Comprehensive Tissue Engineering Support Biotechnology, Biologics) → Definitive Tissue Engineering Platform Technologies.]
18. Data from recent relevant studies, for discussion (a slide from 2008 that puts some relevance to this topic):
    - Monoclonal adult stem cell sphere culture from adult tissues
    - Stem cell biology while in culture, at a glance
    - 2/3-D tissue engineering with adult stem cells (ASCs)
    - In vitro tissue bioreactors for neotissue formation from ASCs
    - Genetic engineering of ASCs for real-time neotissue imaging in vivo
    - Multidisciplinary approaches ~ recreating tissues in humans in vivo
    - Closed-system in vitro organ bioculture system / tissue bioreactor
    - 4-D tissue engineering of adult stem cells, in vitro, in real time
19. [Diagram: SC X → SC X and PG X; PG X → PG Y → PG Z.] Founder Stem Cell X, from adult tissue X, can divide clonally, to maintain the adult stem cell population, or asymmetrically, to give rise to early (PG X) and late (PG Y/Z) progenitors, which build the tissue by differentiating into lineage-specific tissue cells.
20. 2008 slide, for discussion.