Tuesday, February 11, 2014

What's Next for Big Data?

Big Value.  Broad Usefulness.

The time for putting all data to work is now. The cost of collecting and using all data is plummeting, while the tools to access and work with data are becoming simpler. New, big, multi-structured data sets combined with the significant architectural advances of cloud computing have created advantages accessible to any organization. Most profoundly, the next few years will usher in a new class of application, one that uses all available data and is built on the superior economics of cloud-based computing. This class of applications will help make big data small and will be purpose-built to solve an enormous array of real business problems. This is what's next for Big Data.

Cost reduction and improving economics are key ingredients in the pervasive use of data. Open source projects are fueling the rise in usefulness of Big Data and creating disruption on a grand scale. Apache Hadoop and Cassandra are among the best-known examples, but they represent just the tip of the iceberg. A recent IDG survey showed that 70% of enterprises have already implemented or are planning a Big Data project, and about 30% of respondents reported using an open source big data framework, such as Hadoop. Our own Big Data survey showed a doubling of Big Data projects (versus the same survey last year) as well as a marked reduction (a 47% decrease) in confusion about the definition and usefulness of Big Data.

I’ve written previously about the amazing advancements in sensor technologies, storage resources and compute cycles – which create a perfect, low cost foundation for developing and delivering new applications that harness huge data sets to new effect. Two of my favorite examples of this new generation of applications in action are Clockwork and TrackX (formerly Fluensee).

Clockwork has created a modern, cloud-based Enterprise Asset Management system that is transforming the management of capital-intensive assets for any size of organization.  Clockwork recently reported that it has saved its customers more than $3 billion in the past year by combining sensors and sophisticated software to fundamentally improve the maintenance, repair and operations of their assets.

TrackX creates software and technology to effectively track and manage physical assets using RFID, barcode, GPS and sensor technologies with its rapidly deployed, cloud-based software. A key ingredient in TrackX solutions is analytics, because its mission is to provide insight on all asset-related data. TrackX takes all asset attributes and measurements and creates an analytical picture that gives customers actionable intelligence to better optimize time, labor and asset productivity. This insight is cleverly designed into its application software, helping anyone in the organization become more analytically astute.

So, What’s Next for Big Data?

The two large growth areas for analytics are: 
1. making data scientists far more capable by giving them more useful and cost-effective tools than ever before, including Jaspersoft's recent innovations enabling rapid blending of big and small data through data virtualization (a generic sketch of this blending pattern follows this list);
2. making everyone else in the organization more analytically capable by providing them with just the right amount of analytics at the right place and the right time (no more, no less) within the applications and processes they use every day. As a result, they can make better data-driven decisions than ever before.
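
To make the blending idea in the first growth area concrete, here is a minimal, generic sketch in Python (using sqlite3 and pandas). This is not Jaspersoft's engine; every table, column and value is invented for illustration. The "big" side is aggregated at its source, and only the small result set is joined with a reference table at query time:

```python
# A minimal, illustrative sketch of blending "big" and "small" data at
# query time, rather than first copying both into a single warehouse.
import sqlite3
import pandas as pd

# Pretend this is a large operational store (e.g., a Hadoop/Hive
# result set); a local SQLite database stands in for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region_id INTEGER, revenue REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(1, 120.0), (1, 80.0), (2, 200.0), (3, 50.0)])

# "Big" side: the aggregate is pushed down to the source, so only a
# small result set crosses the wire.
big = pd.read_sql_query(
    "SELECT region_id, SUM(revenue) AS revenue FROM events GROUP BY region_id",
    conn)

# "Small" side: a reference table that lives somewhere else entirely.
small = pd.DataFrame({"region_id": [1, 2, 3],
                      "region_name": ["West", "East", "South"]})

# The blend happens in the virtualization layer, not in either source.
print(big.merge(small, on="region_id"))
```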

The second growth area above will have the broadest, most transformational effect on an organization because it can involve everyone, regardless of level or title, making decisions on a daily basis. IDG’s recent survey found that 42% of the respondents plan to invest in these applications in 2014, making this one of the top five areas for investment. 

This investment level should give us great hope: although we're in the very first phase of a revolution in data and analytics, the evidence is now clear. Dramatically lower costs for capturing and using all the data, combined with a new breed of cloud-based analytical applications, will power huge gains in the value and usefulness of data in any organization.

Monday, January 20, 2014

Trend #4: Follow the Money, Business-led Tech Innovation Will Disrupt and Drive Growth

Four Trends That Will Shape 2014 

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series through my first, second and third installments and now my fourth (and final) below. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well. I look forward to the ongoing dialog, and I thank our customers and partners for their business and partnership, which mean everything to us.

Putting more data to work drives innovation.  Innovation can transform processes, products, services and people. Our newfound ability to cost-effectively analyze and find hidden patterns in huge swaths of data will enable a new generation of business-led technology innovation.  With this trend, the IT organization must find new ways to integrate and collaborate within the enterprise, becoming an enabler of business-led innovation.  This collaboration is more important than ever as technology now defines the new economic battleground for every industry and organization.  Even Gartner’s latest predictions abound with a Digital Industrial Revolution theme and watchwords for CIOs and their IT organizations to either lead or get out of the way.  It’s a bold new world.

All companies are now technology companies. Every organization must put technology to work in ways that create distinction and competitive advantage. Evidence of this trend can be found in any industry-leading company today, where IDC says that business units already control 61% of tech spending. Fortunately, the technological barriers to entry have never been lower. Organizations of all sizes now have affordable access to powerful enterprise tools, which levels the playing field, allowing even small companies to compete with the big guys (sometimes even more effectively, because of their nimbleness). One example is AirIT, which can help every airport become a technology-enabled data center, driven by metrics relevant to the business that, in turn, streamline operations and save money.

Leading enterprises will overtly staff, skill and organize to maximize innovative uses of technology – creating a cascade that will impact education, training and personnel management in all corners of the organization.  Even military organizations realize that gaining skill and expertise in data and analytics will remain at the forefront for personal advancement.  The risk for all is that a concentration of even deeper technology skills will create digital haves and have-nots within industries, creating a difficult spiral for laggards.

Lastly, for business-led tech innovation to really flourish, many more knowledge workers than today must have access to just the right amount of data and analysis, at the right place and the right time (not too much, not too little), which promises to make everyone a more capable analyst and decision maker (regardless of job level, title and even skill). In 2014, analytics becomes a thing that you do, not a place that you go, and the need for intelligence inside the applications and business processes we use every day becomes an agent of business-led tech innovation.

Monday, January 13, 2014

Trend #3: From Schema-Constrained to Idea-Constrained, The Real Big Data Opportunity

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series through my first and second installments and now my third below. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well. I look forward to the ongoing dialog, and I thank our customers and partners for their business and partnership, which mean everything to us.

In the past (and too often today), we collected just the data that we could afford to store and for which we had a clear, known use.  In this sense, we were hard-wired to winnow the data down to its most obvious and practical subset; thus, we were (and are) schema-constrained. By this I mean that today we must know, in advance, the uses for data as they are being captured.  This mindset leaves little-to-no room for future, latent value that may exist within a data set.  In physics, we recognize that energy has an immediate value (kinetic) and a future value (latent).  Why should data be any different?  

As costs have declined and the power of technology has increased exponentially, we now have the ability to store and use ALL of the data, not just some of the data. But we may not always know the value of this data while it is being captured. That's okay. The latent value of data will become more obvious each year, and the technology now exists for this to be the norm. In this sense, the real big data opportunity is based on the scale of our ideas for putting data to work, finding new correlations and value where they previously were not discernible.

Unlocking this new value becomes easier as the world is increasingly digitized; that is, we now regularly put entirely new data types to work: geo-position / location, sensor updates, click streams, videos and pictures, documents and forms, etc. Just a few years ago, almost none of this would have been considered "data". Commonly using all these new data types and searching for correlations that can positively impact the business will shift the primary constraint to the quality and quantity of our ideas.
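
A minimal sketch in Python of this "capture first, decide later" idea; the event fields and the question asked are invented for illustration:

```python
# Schema-on-read in miniature: store raw events as-is at capture time,
# impose structure only when a question arrives. Fields are invented.
import json

raw_events = [
    '{"ts": "2014-01-13T09:00:00", "type": "click", "page": "/pricing"}',
    '{"ts": "2014-01-13T09:00:05", "type": "geo", "lat": 37.77, "lon": -122.42}',
    '{"ts": "2014-01-13T09:00:09", "type": "click", "page": "/signup"}',
]

# Months later, a new question arises: which pages did visitors click?
# Because nothing was winnowed away at capture time, the answer is
# still in the data -- the constraint is the idea, not the schema.
events = [json.loads(line) for line in raw_events]
pages = [e["page"] for e in events if e.get("type") == "click"]
print(pages)  # ['/pricing', '/signup']
```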

Perhaps my favorite example of latent data use is the world's first consumer light field camera from Lytro. The Lytro camera captures and processes the entire light field (11 million rays of light, to be precise). This allows every shot to be filtered and viewed from any perspective after the shot, letting the user later discover what the best focus, angle and zoom might yield. Lytro refers to the result as a "living picture" with a 3-dimensional feel. What a beautiful use of big, latent data.

In 2014, we’ll move from being schema-constrained to idea-constrained, more often finding the real value in Big Data.

Monday, January 6, 2014

Trend #2: Cloud-Based Delivery Will Change The Analytics Consumption Pattern

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series with my first installment here and my second below. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well. I look forward to the ongoing dialog, and I thank our customers and partners for their business and partnership, which mean everything to us.

As cloud-originated data growth accelerates, the attractiveness of using additional cloud-based data management and analysis services will grow too. I've previously written about an entirely new generation of platform and middleware software that will emerge to satisfy the new set of cloud-based, elastic compute needs within organizations of all sizes. At this software layer, utility-based pricing and scalable provisioning models will commonly be chosen to match fees paid more closely to actual consumption. These improved economics, in turn, will enable broader use of platform and middleware software, especially reporting and analytics, than ever before possible.

Additionally, cloud-based delivery portends a level of simplicity and ease-of-use (consumer-like services) that defies the earlier generation of enterprise software, ushering in deeper consumption of analytics by organizations of all sizes.  In short, cloud-based delivery becomes a key component of the quest for pervasive analytics – especially when those analytics are delivered as components within the web applications we use every day.

According to Nucleus Research: “As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics.” In 2014, cloud-based delivery will change the analytics consumption pattern.

Monday, December 30, 2013

Four Trends That Will Shape 2014

My favorite part of being CEO of Jaspersoft is spending time directly with our customers and partners.  I learn so much about how we are succeeding together and what Jaspersoft can do better to help them succeed more fully.  Learning from our customers is a passion at Jaspersoft.  We all take notes from our customer discussions and share them throughout the company in an effort to constantly grow our understanding and push us forward.  Their plans and ideas become our ambitions - to create a far better approach to the next generation of reporting and analytics.  Our hope and intention is that our customers will be compelled to rely on Jaspersoft for a continually larger portion of their projects and needs.

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series, starting with my first installment here. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well. I look forward to the ongoing dialog, and I thank our customers and partners for their business and partnership, which mean everything to us.

Trend #1:  Forget Sentiment Analysis, Sensors + Software Will Change the World

Much of the Big Data hype has focused on social media and sentiment analysis, in an effort to get closer to the customer and better understand the market in which an organization competes.  While this is a valid goal, relatively few organizations will find both the skill and useful data patterns that add up to a material top-line difference.

Instead, the focus should be on the "Internet of Things", for the transformative power it represents. Every day, I see more powerful examples of sensors and software in action. I prefer to describe it as "sensors + software" because these terms better symbolize the grittier, more real-world value that can be delivered by measuring, monitoring and better managing vast amounts of sensor-generated data. Why is this important in 2014? Firstly, sensor technology has become remarkably low cost (an RFID tag, for instance, can cost as little as 50 cents, according to this report - which means more data points). Secondly, the storage and analytic technology to capture and analyze this data is incredibly low cost and widely available (often in open source editions). Lastly, sensor-based data is well suited to correlation analysis, rather than strictly looking for causation, which increases the potential for finding value in this machine-generated data.
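
To make the correlation point concrete, here is a tiny Python sketch; the sensor names and readings are made up for illustration:

```python
# Correlation, not causation: a cheap first pass over two streams of
# machine-generated readings from the same asset. Values are invented.
import numpy as np

temperature = np.array([41.0, 42.5, 44.1, 47.8, 51.2, 55.9])  # bearing temp
vibration   = np.array([0.10, 0.12, 0.15, 0.22, 0.30, 0.41])  # amplitude

# A Pearson coefficient near 1.0 flags the pair for attention without
# claiming that one reading causes the other.
r = np.corrcoef(temperature, vibration)[0, 1]
print(f"correlation: {r:.2f}")
```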

Analysts' predictions about the economic and far-reaching value of making "Things" smarter and connecting them to the Internet are vast. Why limit analysis to the words and attitudes of a relatively vocal few (social media and sentiment analysis), when you can analyze the actual behavior of a much larger population (sensor data)? So, I believe a quiet revolution is already underway. In 2014, sensors + software will change the world.

Monday, April 8, 2013

Amazon, Jaspersoft, and the Future of Cloud Computing


Last month, Jaspersoft announced the industry's first completely pay-as-you-go reporting and analytics service on Amazon's AWS Marketplace. With this service, you can literally be up and running (analyzing your data) in less than 10 minutes and pay as little as 52 cents per hour to do so. And, as we've just announced, Amazon and Jaspersoft added more than 100 customers during the first month of availability – a great start to a new service destined to change the way BI is consumed for many purposes.

One of my favorite university professors recently asked me what worries me the most about being on the cutting edge with Amazon and this new service. My response: NOT being on the cutting edge with Amazon and this new service. In other words, I would worry most about not innovating in this way. Disrupting through both our business model and product innovation is a critical part of our culture at Jaspersoft.

In fact, the early success of our new Amazon-hosted service reminded me of two fast-emerging, inter-related cloud computing concepts that, though not discussed sufficiently, will have a substantial impact on the future usage and adoption of cloud-based computing services. These two concepts are: cloud-originated data and the post-transactional cloud [1]. I maintain that, as the former quickly grows, the latter becomes commonplace.

Cloud-Originated Data
While the total digital universe currently weighs in at nearly 3 zettabytes, it is estimated that more than one exabyte of that data is stored in the cloud. Each day, the growth rate of cloud-originated data increases, because of the explosion in services and applications that rely on the cloud as infrastructure. So, a disproportionate amount of the 13X growth projected in the digital universe between now and 2020 will come from cloud-originated data. IDC estimates that by 2020, nearly 40% of all information will be "touched" by cloud computing (somewhere between origination and disposal). Eventually, most of the digital universe will be cloud-based.

The growth in Amazon's Simple Storage Service (S3) provides another compelling data point for the growth of cloud-originated data. In the past several years, Amazon's S3 service has seen meteoric growth, now storing nearly one trillion objects (growing by 1 billion objects per day) and handling more than 650,000 requests per second (for those objects). The chart below illustrates this dramatic growth [2].

[Chart: growth in the number of objects stored in Amazon S3; see note 2.]

Importantly, cloud-originated data is more easily liberated (post-transaction) by other cloud services, which can unlock additional value affordably. According to a recent report by Nucleus Research, companies that are quicker to utilize cloud-based analytics are likely to gain a competitive advantage:

“As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics.”

Ultimately, analytics is just one of many important post-transactional uses of cloud-based data, which will surely be the subject of future posts.

Post-Transactional Cloud
My working definition of the post-transactional cloud is "the next generation of cloud services, beyond Software-as-a-Service (SaaS), designed to enable platform and middleware tools to use cloud-originated transactional data and deliver a richer, more sophisticated computing experience."

The concept of a post-transactional cloud provides a powerful analog that mirrors the history of the on-premises computing world. Let me elaborate.

The ERP/CRM/Supply Chain application boom of the ‘80s and ‘90s preceded an enormous need in the ‘90s and ‘00s for additional tools and software systems designed specifically to create even greater value from the data generated by these (on-premises) transactional applications. Then, tools for data management, data integration, data warehousing and business intelligence (reporting and analytics) were born to deliver this new value.

Cloud computing has grown substantially in the last 10 years largely because of applications hosted in the cloud and made available as a service directly to consumers and businesses. The poster child here is Salesforce.com (although there are thousands of others). Note that we call this category "Software-as-a-Service" when it really should be called "Application-as-a-Service", because the providers in this category are delivering a transactional, process-oriented application designed to automate and improve some functional aspect of an organization. As the use of these managed services/applications grows, so too does the share of cloud-originated data generated by these applications.

The dramatic rise in cloud-originated data from SaaS applications portends a similar need: this one for post-transactional, cloud-based tools and software systems that define a new usage curve for liberating cloud-based data and creating substantially new organizational value. It's just a matter of time, which makes Jaspersoft's work with Amazon clear and understandable.

In fact, Jaspersoft's cloud-based service (across all major Platform-as-a-Service environments, such as VMware's Cloud Foundry and Red Hat's OpenShift, but right now, especially with Amazon's AWS) helps ensure our tools become the de facto standard for reporting and analysis on cloud-originated data (in the post-transactional cloud). We'll do this in two ways:
1. By bringing our BI service to customers who already prefer to use cloud services, and by being available in their preferred cloud instead of forcing them into our cloud; and
2.  By enabling elegant, affordable, embeddable reporting and analysis within cloud-based applications, so those who deliver this software can include intelligence inside their transactional applications.
At Jaspersoft, we ultimately see our cloud-based service as vital to reaching the broadest possible audience with just the right amount of reporting and analytics (not too much, not too little).  The post-transactional cloud will be fueled by cloud-originated data and the need to deliver cleverly-designed intelligence inside this environment will be more important than ever.

Brian Gentile
CEO, Jaspersoft


[1] I've borrowed the term "Post-Transactional Cloud" from ZDNet's Andrew Brust, in his article entitled "Amazon Announces 'Redshift' cloud data warehouse, with Jaspersoft support".
[2] Data and chart excerpted from the TechCrunch article "Amazon S3: 905 Billion Objects Stored, 1 Billion Added Each Day", Sarah Perez, April 6, 2012.


Tuesday, November 13, 2012

The Intelligence Inside: Jaspersoft 5

For more than three years, Jaspersoft has envisioned the capabilities we've just announced in our v5 platform. Because we've always intentionally constrained ourselves by delivering client (end-user) reporting and analysis functionality exclusively inside the web browser, our quest for v5 took longer than we would have liked. But we believe the strengths and advantages of maintaining our simple, pure, web-server-based approach to advanced business intelligence make it superior to relying on desktop-specific code or even browser plug-ins, which must be installed and maintained on every computer, preventing the scale and cost advantages Jaspersoft can offer.

So the interface techniques and features we deliver are constrained by key web client technologies, especially HTML. The trade-offs we've lived with in the past, though, are now essentially eliminated, as a new generation of HTML5 ushers in the consistent, advanced visualization and interaction we've long wanted, while allowing us to maintain our pure web-based client delivery model. Satisfaction.

Jaspersoft 5 is more than a pretty new face. We have delivered a completely new HTML5 visualization engine that enables a new level of rich graphics and interaction, but we're also providing a host of new and more advanced back-end services that make Jaspersoft 5 even better suited to be the intelligence inside apps and business processes. In total, Jaspersoft 5 includes six major new features.

1. Data Exploration 
To enable everyone to become a more capable analyst, the Jaspersoft 5 platform includes stunning HTML5 charts, a new dimensional zoom tool (for exploring data at greater or lesser levels of detail), and the ability to simply change or customize charts and tables to suit a particular line of thought or analysis.
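
As a rough illustration of what dimensional zoom means, here is a generic Python/pandas sketch (not Jaspersoft's implementation; data and columns are invented):

```python
# Dimensional zoom in miniature: the same measure viewed at a coarser
# or finer level of detail.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["West", "West", "West", "East", "East"],
    "city":    ["SF",   "SF",   "LA",   "NYC",  "NYC"],
    "revenue": [100.0,  150.0,   80.0,  200.0,  120.0],
})

print(sales.groupby("region")["revenue"].sum())            # zoomed out
print(sales.groupby(["region", "city"])["revenue"].sum())  # zoomed in
```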

2. Data Virtualization 
Some reporting and analysis applications are best delivered without moving or aggregating data. Instead, the query engine should virtualize those data views and enable reports, dashboards and analytic views to include data from all necessary sources. Jaspersoft 5 includes an advanced data virtualization engine so that building advanced analysis using practically any data source is straightforward, including Big Data sources.
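
In spirit, a virtualized data source looks something like the following generic Python sketch (not Jaspersoft's engine; sources and rows are invented): the report asks one question, and the answer is assembled live from several sources without copying them into a warehouse first.

```python
# A minimal "virtual view": rows are gathered from two independent
# sources at query time rather than pre-loaded into one warehouse.
import sqlite3
import pandas as pd

def virtual_orders_view() -> pd.DataFrame:
    # Source 1: an operational database (SQLite stands in here).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (product TEXT, amount REAL)")
    conn.execute("INSERT INTO orders VALUES ('widget', 19.99)")
    db_rows = pd.read_sql_query("SELECT product, amount FROM orders", conn)

    # Source 2: an extract from another system (inlined for brevity;
    # in practice this might be a CSV export or a Big Data result set).
    extract = pd.DataFrame({"product": ["gadget"], "amount": [5.49]})

    # The union exists only in the virtual layer, never on disk.
    return pd.concat([db_rows, extract], ignore_index=True)

print(virtual_orders_view())
```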

3. Columnar In-Memory Engine 
JasperReports Server has supported in-memory operations for several years. Jaspersoft 5 takes this to a new level with improved performance, new features, and support for up to a full terabyte of in-memory data. This means that billions of rows of data can be explored at memory speed with our new Server.
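
For intuition on why a columnar layout makes that possible, here is a generic Python sketch (illustrative only; nothing here is Jaspersoft code):

```python
# Why columnar helps: an aggregate touches only the column it needs,
# stored as one contiguous array, which is cache- and SIMD-friendly.
import numpy as np

n = 10_000_000  # ten million rows, one numeric column

# A row store would interleave every field of every record; a columnar
# engine keeps each field in its own array, so this scan reads nothing
# but revenue values.
revenue = np.random.default_rng(0).uniform(0.0, 100.0, n)
print(revenue.sum())
```

As a sanity check on the claim: at 8 bytes per numeric value, a terabyte of column data holds on the order of 125 billion values, which is what makes "billions of rows at memory speed" arithmetically plausible.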

4. Enhanced Analytics 
To give the power user analyst another reason to use Jaspersoft, we’re now including greater analytic performance, new analytic features (e.g., conditional formatting, relative date filtering, and cross-tab sorting), consistently rich visualization (see #1 above) and broadened access to multi-dimensional data sources. By supporting the latest XML/A standard, we gain certified access to Microsoft SQL Analysis Services (MSAS) data sources in addition to the traditional Mondrian. More power and greater choice equals greater usage.

5. Improved Administration and Monitoring 
To make life easier for those who administer and manage a JasperReports Server, we're now using our own tool to make our Server smarter and simpler. We've designed a set of best-practice, interactive reports that display system health and report on the most important elements of usage. We've also streamlined the installation and upgrade process, so getting started and staying up to date has never been easier. Together, these improvements are good for our customers and for the technical team that supports them.

6. PHP Support 
Scripting languages are now the most popular tools for web application development. The PHP community needs more advanced reporting and analysis tools to make its applications more data-driven. By extending the JasperReports Server API to include PHP support (via RESTful web service wrappers), we've taken an important first step toward supporting this fast-growing world beyond Java. Welcome to Jaspersoft.
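
Because the PHP wrappers sit on plain RESTful endpoints, any language that can issue HTTP requests can fetch a report. The sketch below shows the underlying pattern in Python for brevity; the server URL, repository path and credentials are illustrative placeholders, not a guaranteed configuration:

```python
# Fetch a repository report over the JasperReports Server REST API.
# URL, report path and credentials below are illustrative placeholders.
import requests

BASE = "http://localhost:8080/jasperserver/rest_v2"

resp = requests.get(
    f"{BASE}/reports/reports/samples/AllAccounts.pdf",  # run report as PDF
    auth=("jasperadmin", "jasperadmin"),                # placeholder login
    timeout=30,
)
resp.raise_for_status()

with open("AllAccounts.pdf", "wb") as f:
    f.write(resp.content)  # the rendered report
```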

Jaspersoft 5 is poised to deliver self-service BI to help many more users answer their own questions, not just because of the beautiful new HTML5 graphing and interaction engine, but because it is designed to be highly embeddable (into apps and business processes) and, maybe most importantly, because it scales so powerfully and affordably. Putting reporting and analytics into the hands of far more users requires this fundamental reset of the BI formula. This is Jaspersoft 5.

I invite you to learn more about Jaspersoft 5 here. And, I look forward to your comments and questions. 

Brian Gentile 
Chief Executive Officer 
Jaspersoft