6. Reality: Typical mobile app release schedule
[Timeline chart, Jan–Dec: releases V 1.0 (MVP), V 1.2, V 1.2.1, V 1.2.2, V 2.0, and V 2.1, each adding features and defect fixes, delivered through a regular internal sprint cycle plus beta testing (2 weeks); a regression + emergency patch is triggered when a new OS version is released.]
Source: Hammond, Jeffrey. Forrester Research, IBM Innovate 2013, June 2013
8. Multi-tier mobile apps specific challenges
MANY: targets, provisioning rules & artifacts, app stores, and backend service versions.
Three tiers: client devices, middle-tier server, and back-end data & services.
Coordinate pipelines for each tier.
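One way to picture the coordination problem above is a release "snapshot" that pins one tested combination of artifact versions across the three tiers, so the per-tier pipelines cannot drift apart. A minimal sketch, with hypothetical names (the original slides do not prescribe this model):

```python
# Illustrative sketch (hypothetical names): a release snapshot pins exactly one
# artifact version per tier, so client, middle, and back-end pipelines stay in
# a combination that was actually tested together.
from dataclasses import dataclass

@dataclass(frozen=True)
class TierArtifact:
    tier: str      # e.g. "client", "middle", "backend"
    name: str      # artifact identifier
    version: str   # version deployed and tested with the others

def make_snapshot(artifacts):
    """Validate that every tier appears exactly once, then index by tier."""
    tiers = [a.tier for a in artifacts]
    if len(tiers) != len(set(tiers)):
        raise ValueError("each tier must appear exactly once in a snapshot")
    return {a.tier: a for a in artifacts}

snapshot = make_snapshot([
    TierArtifact("client", "mobile-app", "2.1"),
    TierArtifact("middle", "worklight-adapters", "1.4"),
    TierArtifact("backend", "order-service", "7.0"),
])
```

The snapshot is the unit that moves between environments; the individual tier pipelines produce the artifacts it references.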
13. Automate Application Deployments to Worklight 6.0
Example use case: deploy the app to the mobile environment, then run automated tests against it.
[Sample environment diagram: Worklight Server (Application Center, Console), Rational Test Workbench (headless), and emulators/devices under test.]
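The deploy-then-test use case can be sketched as a simple gate: deployment must succeed, then the automated test suite runs against the resulting environment. The function names below are illustrative only, not a real Worklight or Rational Test Workbench API:

```python
# Hypothetical sketch of the use case: deploy an artifact to a test
# environment, then run automated checks against it and gate on the result.
def deploy_app(environment, artifact):
    """Record the artifact as deployed into the (mock) environment."""
    environment["deployed"] = artifact
    return True

def run_tests(environment, tests):
    # A real setup would drive headless Rational Test Workbench scripts
    # against emulators/devices; here each "test" is just a predicate.
    return all(test(environment) for test in tests)

env = {"name": "sample-environment"}
deploy_app(env, "MyWorklightApp-2.1.wlapp")
ok = run_tests(env, [lambda e: e["deployed"].endswith(".wlapp")])
```

In a real pipeline the gate result (`ok`) decides whether the build is promoted to the next environment.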
15. IBM UrbanCode Deploy
[Diagram: CodeStation stores the Worklight project settings as a WAR; IBM UrbanCode Deploy deploys it to the sample environment: Worklight Server (Application Center, Console), Rational Test Workbench (headless), and emulators/devices under test.]
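The CodeStation idea, an artifact repository keyed by component and version from which a deployment process pulls exactly the files it was asked to deploy, can be modeled in a few lines. This is a conceptual sketch, not the UrbanCode Deploy API:

```python
# Illustrative model of an artifact repository (CodeStation-like): artifacts
# are stored under (component, version), and a deployment pulls a specific
# version rather than "latest", making deployments repeatable.
class ArtifactRepo:
    def __init__(self):
        self._store = {}

    def push(self, component, version, files):
        self._store[(component, version)] = list(files)

    def pull(self, component, version):
        # Raises KeyError if that exact version was never pushed.
        return self._store[(component, version)]

repo = ArtifactRepo()
repo.push("worklight-project", "1.2", ["app.wlapp", "project-settings.war"])
files = repo.pull("worklight-project", "1.2")
```

Pulling by exact version is what lets the same build be redeployed to each environment unchanged.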
19. Distributed DevOps Pattern
Mobile apps need to be updated and released rapidly. Eliminate the delays spent waiting for operations teams to set up test labs, and shorten the long cycles required to integrate with existing apps, services, and processes.
[Diagram, coordinated by IBM UrbanCode Release across Integration, Test, and Production environments:
Mobile Tier (SOE): IBM UrbanCode Deploy, rapid deployments. Dev, build, device deployment & testing, app store.
Web Services Tier (SOE): IBM UrbanCode Deploy, frequent deployments. Dev, build, app deploy & testing.
Back-end Tier (SOR): IBM UrbanCode Deploy, few deployments. Mainframe transactional services.]
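The pattern above gives each tier its own pipeline and cadence, while a coordinated release only touches the tiers whose desired version differs from what is already running, which is why the back-end (SOR) tier sees few deployments. A hypothetical sketch of that release planning step:

```python
# Hypothetical model of the distributed DevOps pattern: compare the versions
# currently running per tier with the versions a release wants, and deploy
# only the tiers that changed (the stable SOR tier is usually skipped).
def plan_release(running, desired):
    """Return the tiers that actually need a deployment, in desired order."""
    return [tier for tier, version in desired.items()
            if running.get(tier) != version]

running = {"mobile": "2.0", "web-services": "1.7", "backend": "5.0"}
desired = {"mobile": "2.1", "web-services": "1.8", "backend": "5.0"}
to_deploy = plan_release(running, desired)  # backend stays untouched
```

A release-coordination tool in the role of IBM UrbanCode Release would then drive the per-tier deployment processes for just those tiers, environment by environment.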
22. Please note the following
IBM’s statements regarding its plans, directions, and intent are subject to change or
withdrawal without notice at IBM’s sole discretion.
Information regarding potential future products is intended to outline our general product
direction and it should not be relied on in making a purchasing decision.
The information mentioned regarding potential future products is not a
commitment, promise, or legal obligation to deliver any material, code or functionality.
Information about potential future products may not be incorporated into any contract.
The development, release, and timing of any future features or functionality described for
our products remains at our sole discretion.
Performance is based on measurements and projections using standard IBM
benchmarks in a controlled environment. The actual throughput or performance that any
user will experience will vary depending upon many factors, including considerations
such as the amount of multiprogramming in the user’s job stream, the I/O configuration,
the storage configuration, and the workload processed. Therefore, no assurance can be
given that an individual user will achieve results similar to those stated here.