
Disruptive Application Development Technologies Of The Decade


Disruptive Application Development Technologies of the Decade 1999-2009

Published in: Technology, Education


  1. Disruptive Application Development Technologies of the Decade
     Richard Watson, Analyst, Burton Group
     [email_address]
     Twitter: @richwatson
  2. Disruptor #1: Spring Framework
     What did it disrupt? Java Enterprise Edition (JEE, née J2EE).
     Why did it disrupt? JEE adoption collapsed under the weight of its own complexity. The Spring Framework was less “standards-based”, but it was simpler to use, easier to deploy and test with, and, being open source, developers could read (and debug) the code.
     What is its legacy? / What can we learn? The Dependency Injection (DI) and Inversion of Control (IoC) patterns at the heart of Spring remain the dominant application infrastructure patterns. The application design patterns popularized by Spring, along with OSGi, are enabling technologies for the next-generation application platform, or the “stackless stack” (Stephen O’Grady). SpringSource (née Interface21) is an example of a company with a successful open source business model.
     Who deserves the credit? Firstly Rod Johnson, and subsequently his fellow contributors.
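The DI/IoC idea at Spring’s core can be shown without the framework itself. A minimal, framework-free sketch of constructor injection (all class and interface names here are illustrative, not Spring APIs):

```java
// Dependency injection in miniature: the service declares what it needs,
// and the caller supplies it. No names here come from Spring itself.
interface OrderRepository {
    void save(String order);
}

// A trivial implementation; a real one might talk to a database.
class ConsoleOrderRepository implements OrderRepository {
    public void save(String order) {
        System.out.println("saved: " + order);
    }
}

// The service never constructs its own dependency (no "new ConsoleOrderRepository()"
// inside), so a test can inject a fake repository instead.
class OrderService {
    private final OrderRepository repository;

    OrderService(OrderRepository repository) { // constructor injection
        this.repository = repository;
    }

    void place(String order) {
        repository.save(order);
    }
}

public class DiSketch {
    public static void main(String[] args) {
        // The "container" role: wiring happens at the edge of the application.
        // In Spring, this wiring is declared in XML or annotations instead of code.
        OrderService service = new OrderService(new ConsoleOrderRepository());
        service.place("book"); // prints "saved: book"
    }
}
```

Inverting control like this is what makes Spring-style code easy to test: the hard-wired `new` moves out of the class and into the container.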
  3. Disruptor #2: Ruby on Rails
     What did it disrupt? The Java and .NET application platforms, and the Java and C# static-language status quo.
     Why did it disrupt? Ruby on Rails’ clean architecture and prescriptive design patterns made web developers highly productive, more productive than they were on a Java or .NET stack. Rails also brought cowboy web developers and scripters into the fold, allowing them to write better-architected, cleanly separated MVC applications (or, more accurately, not allowing them to write architecturally poor ones).
     What is its legacy? / What can we learn? Developers learned again to be open to new languages, and Ruby rekindled the debate over the virtues of dynamic vs. static languages; dynamic languages are better suited to creating DSLs like Rails. The lesson that the JVM is not just for Java created interest in other special-purpose languages such as Scala and Clojure. Not every application fits the Rails mould: working with existing systems, and especially with existing data models, takes it out of the highly productive zone. But for rapid development of web applications it remains the first choice for many teams. Concerns raised about Ruby’s scalability (e.g., Twitter replaced some core Ruby parts of its system with Scala) are overblown.
     Who deserves the credit? Yukihiro “Matz” Matsumoto for creating Ruby; Dave Thomas and Andy Hunt for sustaining Ruby’s adoption; David Heinemeier Hansson for Rails.
  4. Disruptor #3: Eclipse
     What did it disrupt? Incumbent IDEs: Borland JBuilder, NetBeans, Visual Studio (to some extent), Emacs/Vim.
     Why did it disrupt? Eclipse primarily disrupted on price. Getting budget for an IDE was a hassle for developers in all but the most enlightened shops; Eclipse gave you as much as, if not more than, the commercial IDEs, and it meant one less conversation with a pointy-haired boss. Eclipse’s open, extensible, participatory architecture meant a constant stream of innovations that few, if any, commercial IDE vendors could match.
     What is its legacy? / What can we learn? It proved that extensibility and modularity sustain a platform; arguably OSGi would not be forging ahead without Eclipse modularity as a “killer app”. Beyond the technology, smaller development organizations can learn a lot about effective development practices from the Eclipse Foundation, such as regular release trains and well-thought-out review processes.
     Who deserves the credit? IBM, especially the Ottawa Lab (including Object Technology International), and the other Foundation members.
  5. Disruptor #4: Amazon Web Services/EC2
     What did it disrupt? Managed hosting providers and incumbent IT departments.
     Why did it disrupt? Dealing with managed hosting providers to get applications deployed meant negotiating bespoke contracts and provisioning that could take weeks. Dealing with infrastructure providers in enterprise IT departments was worse: you needed real influence, and it could take months to get the right gear for your development project. Especially with the launch of EC2, Amazon gave development teams a way of getting stepwise operational efficiencies by treating compute and storage in a way they never had been treated before: as commodities.
     What is its legacy? / What can we learn? Once teams get access to dev and test resources as commodities, the guild of IT arts and crafts will never be the same.
     Who deserves the credit? Jeff Bezos for the vision; Werner Vogels and the AWS engineering team for the execution.
  6. Disruptor #5: JBoss Application Server
     What did it disrupt? The BEA, Oracle, and IBM Java application server oligopoly.
     Why did it disrupt? It undercut the major application server vendors on cost.
     What is its legacy? / What can we learn? It demonstrated another successful open source business model.
     Who deserves the credit? Marc Fleury.
  7. Disruptor #6: Open Source Databases
     Disruptors: open source databases including MySQL, Postgres, Derby (Cloudscape), Berkeley DB, and InnoDB.
     What did they disrupt? One-size-fits-all commercial databases: Oracle, DB2, SQL Server, and Sybase.
     Why did they disrupt? The weakness of the enterprise software license model was nowhere better demonstrated than with these incumbent RDBMSs.
     What is their legacy? / What can we learn? This flock of databases is no longer disrupting the major vendors, mostly because the major vendors have acquired them. Work on all of these databases started more than a decade ago, and their longevity is testament to their usefulness. Thankfully, developers and architects are re-examining the data = RDBMS assumption, spawning the “NoSQL” movement.
     Who deserves the credit? Michael (“Monty”) Widenius, Michael Stonebraker, and others.
  8. Disruptor #7: Apache Ant
     What did it disrupt? Make, proprietary build systems, and IDE-internal builds.
     Why did it disrupt? Did you ever use make?! One stray space in the makefile and your productivity was shot. The alternative was proprietary build environments, usually inside IDEs, which were more productive but distinctly unportable.
     What is its legacy? / What can we learn? Test-driven development, continuous integration, and other agile practices are all supported by the build automation methodology Ant kicked off. Ant was an example of a domain-specific language, helping to (re)popularize the DSL concept and paving the way for acceptance of other XML DSLs.
     Who deserves the credit? James Duncan Davidson.
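By way of illustration, a minimal Ant build file showing the XML build DSL the slide describes; the project name, target names, and directory layout are conventional examples, not taken from the presentation:

```xml
<!-- build.xml: targets declare dependencies on each other, so
     "ant test" runs "compile" first. No tab-sensitive syntax, unlike make. -->
<project name="example" default="test" basedir=".">

    <target name="compile">
        <mkdir dir="build/classes"/>
        <javac srcdir="src" destdir="build/classes" includeantruntime="false"/>
    </target>

    <target name="test" depends="compile">
        <!-- a real build would invoke a test runner here, e.g. the junit task -->
        <echo message="tests would run here"/>
    </target>

</project>
```

Because the build file is declarative and portable, the same `ant test` invocation works on any developer machine or CI server, which is what made continuous integration practical.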
  9. Disruptor #8: JUnit/xUnit
     What did they disrupt? Eh… nothing. We didn’t automatically test code before 1999. (sic)
     Why did they disrupt? If JUnit disrupted anything, it was developers’ mindset: JUnit changed how a lot of developers wrote code; not just test code, but all their code.
     What is their legacy? / What can we learn? They supported development teams’ efforts to become “test-infected” and pursue test-driven development. Continuous testing remains at the heart of effective agile practices, such as refactoring.
     Who deserves the credit? Kent Beck (and, for JUnit, his co-author Erich Gamma).
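The shape that became the xUnit pattern is simple enough to show in a few lines. A JUnit 4-style sketch; the `Calculator` class under test is a hypothetical example, not from the slide:

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical class under test, kept trivial so the test shape stands out.
class Calculator {
    int add(int a, int b) {
        return a + b;
    }
}

// A JUnit 4 test class: the framework discovers and runs each @Test method,
// and a failed assertion reports exactly which expectation broke.
public class CalculatorTest {
    @Test
    public void addsTwoNumbers() {
        assertEquals(5, new Calculator().add(2, 3));
    }
}
```

The disruption the slide describes is visible even here: once tests are this cheap to write and run, they get written first, and the production code is designed to be testable.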
  10. Disruptors #9 & #10: ??
      • What did I forget?
      • What disruptive technology influenced your application development efforts 1999-2009?
      • Some say XML and its ecosystem, some say REST… what do you say?
      • Please comment on the blog.