7 New Offerings: Oracle Steps Up Its Big Data Game

For any big data effort to succeed, an organization needs to figure out how to combine the right data from the right sources to generate the right insights to achieve its goals. Transactional applications might hold data on customer purchases, for instance. But their browsing patterns, loyalty interactions and responses to tweeted offers are probably in web-based systems. You need to pull all the pieces together to solve the puzzle and exploit new opportunities.

Oracle understands this problem well, as evidenced by the four new products and three new services the company has announced in recent weeks.

The new products include:

  1. Oracle Big Data Discovery—variously dubbed “the visual face of Hadoop” and “the foundation for data innovation,” it offers business users a straightforward, unified way to explore data from multiple sources, analyze it and share the actionable results, all in minutes, says Oracle. The benefits include radically accelerated time-to-value for big data projects plus participation by a wider range of business users, adding up to bigger insights all around.
  2. Oracle GoldenGate for Big Data—a tool that streams real-time data from multiple transaction systems straight into popular big data environments like Apache Hadoop, Hive, HBase and Flume. Essentially it replicates data between systems in real time, in your choice of formats, without impacting source system performance.
  3. Oracle Big Data SQL 1.1—said to offer a query performance boost of up to 40% over previous versions.
  4. Oracle NoSQL Database 3.2.5—which adds new APIs along with improved security, usability and performance.
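GoldenGate-style replication boils down to capturing change records from source transactions and delivering them to a big data target in a chosen format. Here is a minimal sketch of that idea in Python, using an in-memory list to stand in for a real target like HDFS or Hive; the record fields and function names are invented for illustration and are not GoldenGate's actual API:

```python
import json

def make_change_record(op, table, key, columns):
    """Build a hypothetical change record of the kind a replication
    tool might capture: operation type, table, key and column values."""
    return {
        "op": op,          # "insert" | "update" | "delete"
        "table": table,
        "key": key,
        "columns": columns,
    }

def deliver(records, sink):
    """Append change records to a target as JSON lines, one per change.
    A real handler would write to HDFS, Hive, HBase or a Flume channel."""
    for rec in records:
        sink.append(json.dumps(rec))

# Two changes captured from a source transaction system (invented data).
changes = [
    make_change_record("insert", "ORDERS", {"order_id": 1001},
                       {"customer": "ACME", "total": 250.0}),
    make_change_record("update", "ORDERS", {"order_id": 1001},
                       {"total": 275.0}),
]
sink = []          # stand-in for the big data target
deliver(changes, sink)
print(len(sink))   # 2
```

The point is the decoupling: the capture side reads committed changes without touching source query paths, while the delivery side writes them to the target in whatever serialization the downstream system expects.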

Together these new products “further Oracle’s vision to enable Hadoop, NoSQL, and SQL technologies to work together and be deployed securely in any model—whether public cloud, private cloud or an on-premises infrastructure.” It’s all about “operationalizing insights” by integrating new data sources with existing infrastructure, applications and databases.

The new big data services, all cloud-based, are geared toward helping companies leverage big data specifically for marketing:

  1. Oracle Data as a Service for Marketing is aimed at generating sales leads. It offers a staggering 300 million profiles of business users and companies, which can be used to prospect for new business-to-business customers as well as improve your insight into your customer base and drive smarter cross-channel marketing.
  2. Oracle Data as a Service for Customer Intelligence is designed to provide a clearer picture of customer feedback on products and services, as well as offer insights into emerging trends or customer concerns. Among other data sources, it uses public information from 700 million social networking messages that Oracle collects daily.
  3. Oracle Marketing Cloud for Student Engagement offers templates that universities and others can use to attract students and improve retention among enrolled students. It essentially packages for the higher education vertical a range of existing Oracle cloud services. Similar packages are already available for a wide range of verticals including manufacturing, insurance, entertainment, nonprofits and many others.

These new offerings join two recent Oracle acquisitions (I know I promised I’d stop at seven, sorry…):

  1. The BlueKai platform, “the industry’s leading cloud-based big data platform that enables companies to personalize online, offline and mobile marketing campaigns…”
  2. Datalogix, whose technology “connects offline purchasing data to digital media to improve audience targeting and measure sales impact.”

All these offerings are aimed at helping Oracle customers advance their big data capabilities faster and with greater ease and success. “More people want to use Oracle software without having to run Oracle software,” said Thomas Kurian, Oracle’s executive vice president of product development, at his Oracle OpenWorld 2014 keynote.

Is your IT department looking to respond to business demands for big data analytics that wring new insights and competitive momentum from your Oracle databases? Do you have the expertise you need in-house to address these new challenges while continuing to maintain current databases and applications? Contact Buda Consulting to discuss options for augmenting your core team with an expert Oracle DBA partner that can help with new demands or backstop everyday processes. 


Still Think Your Business is Too Small for Big Data?

Just because your business isn’t that big doesn’t mean you don’t have “big data” issues or a need for big data-style analytics to remain competitive. “Big data” is a relative term—relative to needs and capabilities for making business decisions, that is. Nearly every organization, whatever its size, will sooner or later reach the point where the volume, variety and velocity of the data it needs to analyze exceeds its storage and/or computational capacity, such that accurate and timely decision-making is impacted.

At that point, you have a big data challenge/opportunity. You can set the data aside and fail to leverage it. Or you can find a cost-effective way to apply analytics that gives your company new insights and answers to help you compete and grow.

Understanding how to manage and analyze data to meet your evolving needs is critical, because big data analytics aren’t one-size-fits-all. You’ll potentially need to capture and integrate diverse sources of structured and unstructured data across everything from standalone department-level transactional systems to social media feeds to web forms to smart devices to external vendors’ services. Once you’ve marshaled all this data so that business users can query it, they can begin to exploit it in both planned and newly discovered ways.

As you begin thinking about a big data analytics application, top-level planning considerations include:

  • Focus on business value first. What questions do you need to answer? Where does the data reside that you’ll need to process? In blogs, Facebook and other social media? In your customer transactions? Do you want to cut costs? Predict consumer buying patterns? Accelerate time-to-market for new innovations? Forecast supply and demand? Let business needs drive the technical approach.
  • Analytics capability is the bottom line. How will you determine what data is relevant and how it should be extracted, stored, transformed, etc.? If you think you have “too much data,” the real problem is more likely that your analytics environment isn’t equipped to handle it.
  • Rely on data architecture best practices. “Big” shouldn’t mean “out of control.” Whatever the technology and implementation involved, keep your Oracle DBAs in the loop so that data movement and transformation are effectively planned for and handled.
  • Make sure you have the right resources in place to deliver the business value you’re looking for. Many companies don’t have in-house expertise to manage data effectively, for example. Data analytics expertise is also scarce. Augmenting your in-house skills with an outsourced Oracle DBA can yield the most value in the shortest time at the lowest cost.

Once you know what you want to accomplish you can more effectively plan for implementation. Any big data initiative involves acquiring, transforming and storing large volumes of data from multiple source systems before that data can be analyzed.

Where will you get the source data you need? How will you handle data queries? How will you optimize performance? How will you manage testing, and against what data? Will you need more network capacity to handle the data movement? How do you integrate the new analytics solution with any existing data warehouse or other key data sources? As your new capabilities are used more and more, how do you monitor performance and plan for growth?

These are some of the central implementation questions your team will need to answer before your big data “challenge” can become an “opportunity” for decision-makers. Planning at the IT infrastructure level is, of course, critical—but knowing what business questions you want to answer should be the driving force behind the initiative.

To ensure your big data analytics application can quickly generate reports, in appropriate formats, that provide the insights your business demands, contact Buda Consulting. A free consultation with us can be an ideal way to explore your big data analytics challenges, with an eye toward best-practice data modeling, database design, performance and more.


Financial Services Spotlight: Risk Data Aggregation and Risk Reporting

The finance, risk and compliance departments of any financial services firm all need fast and comprehensive access to business data in order to measure performance, manage risk and report to regulators and clients. But each department needs a specific view, whether strategic, operational or a combination of the two.

Risk data aggregation in particular has garnered considerable attention since the Basel Committee on Banking Supervision (BCBS) published Principles for effective risk data aggregation and risk reporting, often called BCBS 239, in 2013. Banks are required to be fully compliant with all eleven principles of BCBS 239 by January 1, 2016—and many will require considerable resources and expertise to get there.  

In the past, risk managers have often had to decide for themselves what data they needed. But regulators are now specifying more about what a risk management analytical framework needs to look like. The goal is to help financial services institutions individually and collectively to avoid counterparty risk and systemic risk, to help prevent a repeat of the 2008 financial crisis.

One of the key lessons learned in the aftermath of the 2008 crisis was that financial services organizations’ IT systems and data architectures were insufficient to enable the management of financial risks, especially around aggregating risk exposures and identifying concentrations of unacceptable risk quickly and accurately at the group level and across lines of business.

Data aggregation frameworks can offer a complete view of the risk inherent in each exposure, counterparty, customer, product and so on—in minutes rather than days. Having the right information at hand to make an optimal decision quickly can make an enormous difference. The time it took banks to determine their total exposure to Lehman Brothers at the peak of the crisis is a cautionary illustration of the value of such a system.
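At its core, this kind of aggregation is a roll-up of individual exposures to a common key such as the counterparty. A minimal sketch in Python, with invented counterparties and figures (a real framework would also handle currencies, netting, collateral and data lineage):

```python
from collections import defaultdict

# Illustrative positions: (counterparty, line_of_business, exposure_usd).
# All names and amounts are invented for the example.
positions = [
    ("Lehman Brothers", "derivatives", 120_000_000),
    ("Lehman Brothers", "repo",         45_000_000),
    ("Northern Rock",   "loans",        30_000_000),
    ("Lehman Brothers", "securities",   60_000_000),
]

def aggregate_exposure(positions):
    """Roll exposures up to the counterparty level, across all lines
    of business, in a single pass over the position data."""
    totals = defaultdict(int)
    for counterparty, _line, exposure in positions:
        totals[counterparty] += exposure
    return dict(totals)

totals = aggregate_exposure(positions)
print(totals["Lehman Brothers"])  # 225000000
```

The hard part in practice isn't the roll-up itself but getting every line of business to feed consistent, timely, well-identified position data into it, which is exactly what BCBS 239's principles target.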

But despite the clear mandate and clear benefits of developing compliant risk data aggregation and risk reporting, the Basel Committee’s December 2013 preparedness survey of thirty “systemically important banks” showed that these banks self-rated their compliance at 2.8 overall on a scale from 1 (noncompliant) to 4 (fully compliant). Principles 2, 3 and 6 (data architecture/IT governance, accuracy/integrity and “adaptability,” respectively) scored the lowest at around 2.5/2.6. Half the respondents indicated that they were far from being compliant in these areas. Since the principles are interdependent, even a few weak spots can make overall compliance a significant challenge.

To comply with BCBS 239 in time, financial services companies will need to:

  • Automate manual processes to accelerate data management and analytics
  • Consolidate today’s disparate views of risk
  • Bolster the reliability of risk systems and risk data quality assurance
  • Improve risk data governance, data ownership and procedural documentation

According to experts at Oracle, speeding up data management and analytics practices is key to avoiding the kind of risk that rocked the world in 2008. Firms that embrace near real-time/on-demand analytics and similar data management technologies will be able to aggregate data much faster across different classes, different lines of business and different data structures—including unstructured data. This will enable them to better pinpoint and evaluate risk to predict problems before they become catastrophic.

Of course, data quality is also vital. How can a firm calculate or predict risk exposure if its data is unreliable or incomplete?

Due to the complexity and diversity of many financial services firms’ data management systems, an objective, third-party assessment of where you are and where you need to go can be the best way to move “with all deliberate speed” towards compliance with BCBS 239.

Buda Consulting has over fifteen years’ experience with building and assessing these kinds of complex, mission-critical database applications. Get in touch to discuss how we can help you evaluate and address your risk data aggregation and reporting challenges.