This document discusses upcoming innovations for the Java programming language. It describes several new OpenJDK projects that will improve Java, such as projects to add support for native code interoperability (Panama), value types and memory efficiency (Valhalla), new language features (Amber), and lightweight concurrency (Loom). It also discusses how Java will continue to evolve more quickly with a predictable release schedule. Finally, it mentions how these changes will help Java stay relevant in an era of cognitive computing and artificial intelligence.
When we wish to peep into the future, we must start with the present. Looking at Java’s future is no different.
Java's current users are doing great wonders with Java. However, it is important to appreciate users' fast-changing needs and present them with a pipeline of features and ideas: some they can experiment with today, others they can start thinking about. This presentation is about those ideas.
How many use features of Java 8 today? How many have moved to modularity?
There's the catch: Java 6 to 7 took 5 years, 7 to 8 took 3 years, and 8 to 9 took 3 years.
The new release schedule is better: releases spaced 6 months apart, easier migration, incremental innovation.
The new numbering system (year.month)
The current LTS is Java 8. LTS releases come once every 3 years; the next LTS will be Java 11 (18.9).
Who's used lambdas and streams? And Jigsaw or reactive streams?
Heard of the new projects in Java?
How many of you have used JNI and developed native code?
Switching between managed and unmanaged runtimes is costly.
JNI is extremely challenging.
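A minimal sketch of why JNI is brittle: on the Java side a native method is only a declaration, and nothing at compile time checks that a matching C implementation exists or is loaded. (The class and method names here are hypothetical, for illustration.)

```java
public class NativeSum {
    // The implementation must be written separately in C against the
    // JNI header generated by `javac -h`, then compiled and loaded
    // with System.loadLibrary(...).
    static native long sum(long[] values);

    public static void main(String[] args) {
        try {
            sum(new long[] {1, 2, 3});
        } catch (UnsatisfiedLinkError e) {
            // Without a compiled and loaded native library, the call
            // fails only at runtime -- part of what makes JNI brittle.
            System.out.println("native library not linked");
        }
    }
}
```

Project Panama aims to replace this boilerplate with a safer, pure-Java way to bind to native libraries.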
Panama needs Valhalla
Primitives and objects: in Java, user-defined data structures have to be objects. They can't be primitives.
Object header is a big overhead for large arrays of simple data structures.
Example: complex numbers, or tuples such as a 2D point.
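A sketch of the overhead Valhalla targets: today an array of small objects is an array of references, each element a separate heap object with its own header. (The `Point` class is illustrative.)

```java
// Today: Point[] is an array of *references*; every element is a
// separately allocated heap object carrying an object header.
final class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }
}

public class Headers {
    public static void main(String[] args) {
        Point[] pts = new Point[3];
        for (int i = 0; i < pts.length; i++) pts[i] = new Point(i, i);
        // With Valhalla value types, the JVM could flatten this into a
        // contiguous block of (x, y) pairs -- no per-element header,
        // no pointer chasing.
        int sum = 0;
        for (Point p : pts) sum += p.x + p.y;
        System.out.println(sum);  // (0+0) + (1+1) + (2+2) = 6
    }
}
```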
A lot of language enhancements.
Type inference was introduced in Java 10. In the future, we will have raw string literals that remove the "escape hell". Pattern matching is a really powerful functional-programming construct.
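Of these, only type inference has shipped; a small sketch of `var` as introduced in Java 10 (the variable names are illustrative):

```java
import java.util.List;
import java.util.Map;

public class AmberDemo {
    public static void main(String[] args) {
        // Local-variable type inference: the compiler infers
        // Map<String, Integer> and List<String> from the initializers.
        var counts = Map.of("a", 1, "b", 2);
        var names = List.of("Panama", "Valhalla", "Amber", "Loom");
        System.out.println(names.size() + " projects, "
                           + counts.size() + " entries");
    }
}
```

Raw string literals and pattern matching were still proposals at this point, so no stable syntax is shown for them.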
New concurrency model: ForkJoinPool + Continuations
Project Loom proposes lightweight threads called fibers. Fibers are managed by the Java runtime scheduler (a ForkJoinPool). Continuations are really cool programming constructs that can be paused and resumed. The new concurrency model runs continuations on fibers: you pause when you need something, you continue when you have it.
This async model of IO has been a great success with many languages.
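Loom's API is still being designed, so as a stand-in, today's `CompletableFuture` makes the pause/resume idea explicit: the callback runs once the value is available, which is roughly what a fiber would do implicitly while blocking.

```java
import java.util.concurrent.CompletableFuture;

public class PauseResume {
    public static void main(String[] args) {
        // Explicit async style: thenApply "continues" once the supplied
        // value exists. With fibers, plain blocking code would get this
        // behavior automatically from the runtime scheduler.
        CompletableFuture<String> data =
            CompletableFuture.supplyAsync(() -> "payload")
                             .thenApply(p -> "got " + p);
        System.out.println(data.join());
    }
}
```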
Everything that runs on code, generates data. What’s next? Cognition.
Cognition in simple words is “understanding data to provide meaningful insights”.
The problems of the future are different! Are we ready to solve them?
We have modeled a large part of the observable universe. There’s much more left.
Anything that is modeled can now have a thinking ability. It can provide you with deep insights. Data is the new oil. Computing has entered a new era: the cognitive era.
Gleaning insights about your customer’s personality.
Finding news that really matters to you.
Products, services and processes that improve themselves.
Intelligent sales.
Offloading to dedicated co-processors.
GPU’s have parallelism baked into them. They can be exploited for certain tasks.
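A CPU-side analogy of that data parallelism, using Java's parallel streams (no GPU involved; the point is the uniform per-element workload that co-processors accelerate):

```java
import java.util.stream.LongStream;

public class ParallelSum {
    public static void main(String[] args) {
        // The same tiny operation applied independently to a million
        // elements -- exactly the shape of workload GPUs excel at.
        long sum = LongStream.rangeClosed(1, 1_000_000)
                             .parallel()
                             .sum();
        System.out.println(sum);  // 1_000_000 * 1_000_001 / 2 = 500000500000
    }
}
```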
Building a neural network to do the search? Finding something with enough confidence.
Deep-learning chip!
Von Neumann machines mimic the left brain. Cognitive architectures mimic the right brain (neural networks).
Mimics the function of the human brain in real time, at low power and small size.
The Corelet language.
Quantum algorithms such as Grover's algorithm promise a big boost to AI.
Modelling the caffeine molecule.
We could offload certain computations to a quantum computer.