
DATABASE PERFORMANCE: THE NEED FOR SPEED & SCALE

Oracle (page 14)
The Evolution of Enterprise Cloud Technology

Quest (page 16)
Today’s DBA and the Challenges of Managing Multiple Database Environments

Aerospike (page 18)
Forrester Survey Highlights Trend Towards Hybrid Memory Architecture

Pure Storage (page 19)
Deliver Better Oracle Application Performance

MariaDB (page 20)
Intelligent Query Routing for Scalable, High-Performance Data Access

Percona (page 21)
The Evolving Open Source Database Landscape

12 OCTOBER/NOVEMBER 2017 | DBTA

ADVANCING
the Data-Driven
ENTERPRISE
Best Practices Series

These are interesting times for the database world and its participants. Today, multiple forces—from big data consumption and processing to the Internet of Things, real-time analytics, and artificial intelligence and machine learning—are kicking database performance demands into high gear. All these initiatives mean unprecedented opportunities to advance the evolution of many businesses into data-driven enterprises, in which decisions at all levels are based on analytics. In this real-time economy, the faster data analytics can be delivered, the better.

Enterprise business leaders understand this requirement. A majority of more than 300 managers and professionals surveyed, 57%, state their business leaders now rely heavily on analytics in their day-to-day decision making. The research from Unisphere/Information Today, Inc. also points to issues with becoming a data-driven enterprise (“Moving Data at the Speed of Business: 2016 IOUG Survey on Data Delivery Strategies”). A majority of survey respondents complain about a lack of complete information. Plus, most organizations are not where they want to be in terms of data delivery.

Leading inhibitors to moving data faster are database performance issues (cited by 48%) and network performance (45%). Data quality also stands out as a leading roadblock to the faster delivery of data, cited by 45%. Database performance remains the greatest issue holding back the rapid delivery of data to decision makers, the survey finds. Close to half of data managers and professionals report this is the main technical challenge they face when it comes to moving toward real-time data delivery. Data warehouse queries or reports often take time to produce. Close to one-third of respondents, 31%, say the time to run these reports often will take longer than an hour. In fact, close to one in 10 says it could take longer than a business day to run a report.

Organizations are employing a range of new strategies and approaches to improve the speed of data delivery and integration. “Building any real-time application requires infrastructure and technologies that accommodate ultra-fast data capture and processing,” write Gary Orenstein, Conor Doherty, Mike Boyarski, and Eric Boutin in Data Warehousing in the Age of Artificial Intelligence. More needs to be done, because time is of the essence: a global, hyper-competitive economy demands nothing less.

Here are key actions that need to be taken to allow data to move at maximum velocity to meet the demands of this new environment:

Ensure that the business drives data speed requirements.
There is information that is required for real-time functions, such as machine learning-driven algorithms, but there is also information in which latency is not an issue.

Be able to support both real-time and historical information.
There needs to be a distinction drawn. “Today’s applications thrive using both

real-time and historical data for fast, accurate insights,” according to Orenstein and his co-authors. “Historical data provides the background for building an initial algorithm to score real-time data as it streams into an application.” They add that “converging traditional online transaction processing and online analytical processing systems requires the ability to compare real-time data to statistical models and aggregations of historical data. To do so, a datastore must accommodate two types of workloads, without compromising on latency.”

Exploring artificial intelligence and machine learning.
These are capabilities that will enable adjustments and insights that do not require manual intervention. “To enable real-time machine learning, you will need a modern data processing architecture that takes advantage of technologies available for ingestion, analysis, and visualization,” according to Orenstein and his co-authors. A McKinsey survey of more than 3,000 organizations finds AI adoption is in its infancy, with half using three or more AI technologies within their premises. Another 41% are piloting AI efforts. The five categories of AI technology identified by McKinsey include robotics and autonomous vehicles, computer vision, language, virtual agents, and machine learning.

Install in-memory technologies.
There are a range of in-memory capabilities now available within leading database products, as well as open source environments such as Apache Spark. It is a must-have technology for any high-performance data environment. In-memory databases and platforms offer an option to rapidly accelerate data analytics, making it a key step toward real-time integration and delivery. About 28% of organizations now employ in-memory technologies, and another 23% are piloting or evaluating the approach. Lack of understanding may be inhibiting in-memory deployments, however, and greater industry education is called for. Close to half of data managers and professionals, 46%, admit they have only a basic, limited, or minimal understanding of the technology, and 8% say they have no knowledge at all of in-memory. In-memory is seen as having a vital role to play in the success of data-driven enterprises in the months and years ahead. A majority of managers and professionals, 52%, regard in-memory technologies as critical to their organization’s competitiveness.

Draw on cloud resources.
Cloud providers offer the scale and throughput essential for high-performance data processing that may be too cost-prohibitive for organizations to build and maintain on-premises. While many enterprises are turning to cloud-based data delivery strategies, it’s going to take some time until significant workloads will be moved into these environments, the survey shows. Currently, 10% of managers and professionals say they have most of their data integration workloads being handled by cloud. Overall, about half of respondents indicate that there are at least some workloads moving in this direction.

Turn on data streaming.
Data streaming technologies and services process information and insights that move through their systems on a real-time basis, versus the more traditional approach in which data is stored and indexed. A survey of 350 enterprises by Confluent finds a continuing embrace of Apache Kafka, the open source data streaming platform. The overwhelming majority, 86%, indicate the number of their systems employing Kafka is on the upswing, and 20% report this growth was significant. Fifty-two percent report having at least six systems running Kafka, up from 41% over the previous year. Streaming data platforms will play a key role in processing data flowing in from sources such as the Internet of Things and cloud-based sources.

In an environment that demands real-time insights, enterprise decision making is weighed down by incomplete and slow-moving information. Current approaches may not be enough to integrate and deliver data within minutes or seconds, as required in many of today’s businesses. This is due to continued inadequate performance, siloed data, and slow response times from existing data systems. This calls for a new data architecture, and new approaches to data integration are needed. With the rise of cloud, big data, and machine learning, as well as the need to deliver information at real-time speeds, organizations are looking at an array of newer options to support analytics for their enterprises.

—Joe McKendrick
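The pattern Orenstein and his co-authors describe, building a model from historical data and then scoring records as they stream in, can be sketched in a few lines of Python. This is purely illustrative: the class name, the z-score model, and the threshold are assumptions for the example, not taken from any product mentioned in this issue.

```python
from statistics import mean, stdev

class StreamScorer:
    """Toy hybrid workload: aggregate historical data once,
    then score each incoming record against it in real time."""

    def __init__(self, historical, threshold=3.0):
        # Analytical side: aggregates computed from historical data.
        self.mu = mean(historical)
        self.sigma = stdev(historical)
        self.threshold = threshold

    def score(self, value):
        # Transactional side: per-record, low-latency scoring.
        z = abs(value - self.mu) / self.sigma
        return z > self.threshold  # True means anomalous

# Historical readings feed the model; streaming values are scored one by one.
history = [100, 102, 98, 101, 99, 103, 97, 100]
scorer = StreamScorer(history)
print(scorer.score(101))  # within the historical range
print(scorer.score(250))  # far outside the historical range
```

In a real deployment the "historical" side would be a warehouse query or aggregate table and the "streaming" side a message consumer, but the division of labor is the same: slow, heavy aggregation up front, cheap per-record comparison at ingest time.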
Sponsored Content

The Evolution of Enterprise Cloud Technology
Cloud technology has changed everything—it has been one of the biggest transitions in the IT industry over the past three to four decades. Today’s customers require greater agility and faster innovation, and need the flexibility to run their workloads both in the public cloud and on premises with a predictable and transparent IT cost structure. As a result, they’re looking for application-delivery engines to speed up new application deployments and provide the services their business demands to be competitive. These applications must be developed quickly by small teams, consume cloud services as building blocks, release daily, deploy quickly, and scale across multiple data centers. What’s more, companies require application development and data processing that work behind their corporate firewalls in order to guarantee security and abide by data-governance regulations.

CLOUD ADOPTION RATES ARE HIGH
The cloud has completely transformed enterprise IT, and as the technology and services available have matured, adoption rates have soared.
But how do organizations make the choice about where to deploy their applications? Most customers want to combine the best of cloud and on-premises applications into a single architecture, and it’s easy to see why: This gives them maximum flexibility and leverage with their existing investment, and their gradual adoption of the cloud. The cloud is appealing, offering rich data services, global reach, considerable elasticity, pure OpEx, and more. But not everything can run in a public cloud … at least, not today. One issue customers have had with public clouds concerns data sovereignty and compliance. They need to address the issue that certain workloads and data must reside within the customer’s country borders, or behind the firewall within their data center.

SIGNIFICANT BARRIERS REMAIN
While cloud adoption is high, a relatively small percentage of all workloads are deployed in the cloud. Customers need to manage workloads seamlessly between on-premises environments and clouds. And there are valid reasons why the public cloud is not always an option:
Data sovereignty
Companies are required to comply with regulatory, legal, and privacy requirements and keep sensitive data on premises.
Data control
They must also maintain control of business-critical systems, and often want a dedicated infrastructure within their own data center.
Latency
In addition, they need to connect with back-end mainframes, databases, ERPs, and more, with near-zero latency.

ORACLE TURNS OBSTACLES INTO OPPORTUNITIES
Oracle Cloud Machine, Oracle Exadata Cloud Machine, and Oracle Big Data Cloud Machine are the first solutions provided as a part of the Oracle Cloud at Customer service. Customers can utilize Oracle Cloud services with Cloud at Customer in the areas of Infrastructure, Data Management, Big Data and Analytics, Application Development, Enterprise Integration, and Security. In addition, Oracle provides a complete Software-as-a-Service portfolio, including Enterprise Resource Planning, Human Capital Management, Customer Management, and Supply Chain Management, for customers to use with Oracle Cloud at Customer.
Oracle is bringing cloud innovation to the customer, enabling enterprises to gain all of the benefits of PaaS, IaaS, and SaaS in their data centers while maintaining:
• Tight control over data sovereignty and governance concerns
• Full integration with network security
• Low latency interaction with your other on-premises applications and data

CLOUD AT CUSTOMER PROVIDES A STEPPING STONE IN THE JOURNEY TO THE CLOUD
Oracle Cloud at Customer allows customers to unlock the benefits of the cloud faster, easier, and with less disruption. Customers can now leverage the latest innovations and take advantage of the rapid development that the cloud provides—increasing their productivity, reducing their costs, and providing unprecedented flexibility in how they consume cloud services. Managed by Oracle, with single-vendor accountability, Oracle Cloud at Customer delivers increased productivity and reduced costs, while controlling data sovereignty and governance concerns.

ORACLE
www.oracle.com

PRAGMATYXS MAXIMIZES STAFF PRODUCTIVITY, ENHANCES DATA SECURITY, AND BOOSTS INNOVATION BY MOVING DATABASE TO THE CLOUD
Pragmatyxs, Inc. is a boutique consulting and software firm that works with Fortune 500 clients to ensure smooth communications between their supply chain, quality and business systems, as well as the barcode and product labels mandated by regulatory bodies. The company has leveraged Oracle technologies since its inception in 1995 and it has been an Oracle partner for the last 15 years.
Pragmatyxs had an on-premises Oracle Database environment and was using Oracle Java for product development. The company decided to migrate these environments to Oracle Platform as a Service (PaaS) to reduce the amount of time that its staff spent on maintenance and support—ultimately enabling them to focus more attention on innovation instead of troubleshooting IT issues.

CHALLENGES
• Boost competitive advantage by accelerating time to market and value of new solutions designed to integrate clients’ business and supply chain systems
• Help clients to maintain and increase success through implementation of accurate product labeling, bar coding, and other tracking solutions as mandated by regulatory bodies
• Increase business productivity by minimizing the resources required to run and maintain the company’s onsite-hosted database environment
• Ensure Pragmatyxs’ systems run at peak efficiency 24/7 to enable it to provide its Fortune 500 clients with solutions and resources that fit their complex requirements

RESULTS
• Migrated Pragmatyxs’ on-premises Oracle Database and Oracle Java environments to the Oracle Cloud to reduce the amount of time that its IT team spent on maintenance and support, without incurring upfront hardware costs or having to perform complex tasks such as upgrades and patching—lowering IT costs by 50%
• Delivered efficiency and ensured true 24/7 availability of Oracle Database—a constant challenge for small companies because, while IT teams may be on call 24/7, there is often a time delay in both reaction and resolution to any out-of-hours issues
• Leveraged familiar technology by simply moving its on-premises Oracle Database and Oracle Java solutions to the cloud, eliminating administrative costs and accelerating configuration for new environments
• Launched a label printing cloud service for Pragmatyxs’ partners and remote facilities in less time and at a better value with Oracle Database Cloud Service and Oracle Java Cloud Service
• Selected Oracle Cloud as a fully managed PaaS—a strategic initiative for its business that enables it to compete more effectively
• Improved customer satisfaction by enabling Pragmatyxs to offer its business-to-business customers uniform, standards-based integration independent of the client form factor (web browser versus mobile device, for example) with Oracle Identity Cloud Service
• Empowered customers to streamline regulatory compliance and save approximately 20% annually by generating product labels more quickly and accurately
• Delivered multi-tenancy capabilities and improved data security, which is critical for Pragmatyxs’ clients, who risk regulatory penalties if labels are not accurate
• Achieved a competitive advantage by delivering new solutions to the market faster and at lower cost thanks to the ability to spin up DevTest environments much more quickly
• Laid a foundation to support mobile capabilities as well as to take advantage of Oracle Database Cloud Service’s corresponding services, including messaging, integration service, and document services

Today’s DBA and the Challenges of Managing Multiple Database Environments
Today, DBAs are tasked not only with
administering more and more databases,
but also with exploring and deploying a
much broader variety of databases. After
all, adding platforms helps organizations
avoid over-exposure to the contract
negotiation leverage of any single vendor.
On top of that, DBAs themselves often
push for adding in open-source solutions,
many of which are fully mature and offer
the transparency, scalability, flexibility and
support that both DBAs and enterprises
are looking for.
However, this strategy comes at a price:
Large, heterogeneous environments can
be a serious challenge to manage. If you’re
a DBA, database performance is one of
your top responsibilities, and system
uptime is likely the key metric by which
you are measured. How can you ensure
high performance and availability in the
cross-platform database environment
you want to implement or are already
overseeing?
You don’t have time to constantly
learn new native or third-party database
monitoring tool sets, let alone juggle
multiple tools on a daily basis, hoping you
can keep them straight and not make any errors. Moreover, multiple separate tools simply can’t give you the global view you need to quickly troubleshoot problems, prioritize issues and meet your service-level agreements (SLAs).
You can simplify database performance monitoring and management—even as your database environment becomes increasingly complex—with Quest Foglight for Databases.
The Foglight solution is built to support both small and large enterprise environments with thousands of databases. It standardizes performance monitoring and diagnostics across a broad range of platforms, including Oracle, SQL Server, DB2, SAP ASE, MySQL, PostgreSQL, MongoDB and Cassandra, consolidating information into a single comprehensive view so you can ensure consistently high database performance and availability. With a low-overhead agentless architecture, you’ll get a wealth of information without all the hassle and expense of native tools and point solutions, which means you spend less time fighting database fires and can implement a proactive approach to improving the health and performance of your database environments.
Unlike database-specific point solutions, Foglight for Databases delivers a coherent, global view of your entire database environment, with consistent, powerful functionality. No matter what combination of supported databases you’re managing, and whether your databases are deployed on premises or in the cloud, you’ll get all of the following capabilities:

[Infographic: The average enterprise has 100-500 database instances running today across multiple database platforms. At 60% of enterprises, the number of database instances each DBA is responsible for managing is growing in size and variety. At 60% of enterprises, the size of DBA teams is not growing. A diagram of an Oracle instance (System Global Area, background processes such as PMON, SMON, DBWn, LGWR and CKPT, and the data, control and online redo log files) illustrates the complexity of a single platform.]

• Resolve performance issues across database platforms by determining your most critical problems and taking immediate action.
• With one click, limit your view to just one database platform or a particular database server. Or set up groups of database servers and click to view just your production servers.
• Quickly drill down to investigate platform-specific database health or performance issues—both real-time and historical—in detail.
• Get integrated management and performance views that enable you to understand database health and activity.
• Receive alerts about deviations from normal activity during different time periods and track performance with

automatic detection and calculation of normal ranges for all metrics.
• Resolve performance issues by easily navigating through diagnostics and alarm data from any drill-down screen.
• Use the data collected by Foglight to develop customized views, reports, and alarms.
• Drill into the data cube to easily investigate database workload. View every dimension of your data, including users, programs, SQL and sessions (available for Oracle and SQL Server).
• Review and investigate changes to servers, instances, databases and schema, as well as application SQL degradations. Use customizable alerts to keep abreast of critical changes.
• View wait-event data down to the statement level to rapidly resolve resource-related performance problems.
• Stay alerted to critical issues with out-of-the-box alarms, including baseline deviation alarms, which provide detailed information for troubleshooting. Easily add alarms, including alarms based on your own scripts. Search for past solutions, set up blackouts, and manage and annotate your alarms.
• Integrate your other end-to-end enterprise monitors seamlessly.

Foglight for Databases offers the broad coverage you need, with built-in platform-specific expertise. It’s a modular solution that grows with your database environment. Simply start with the cartridges you need now, and add new ones as your environment grows and changes.
Each Foglight cartridge includes platform-specific functionality along with built-in expertise that dramatically shortens the learning curve, so you’ll quickly be able to ensure the same high performance and availability for your new databases using the same familiar tool and workflows, without having to invest time and money learning all the minutiae associated with each new database technology.

WANT TO LEARN MORE?
www.quest.com/foglight-for-cross-platform-databases
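Baseline deviation alarms of the kind described above boil down to learning a normal range for each metric from historical samples and flagging values that fall outside it. A minimal sketch in Python follows; it is illustrative only, not Foglight’s actual implementation, and the metric name, sample data and two-standard-deviation band are all assumptions.

```python
from statistics import mean, stdev

def normal_range(samples, k=2.0):
    """Derive a 'normal range' from historical samples:
    mean plus or minus k standard deviations."""
    m, s = mean(samples), stdev(samples)
    return (m - k * s, m + k * s)

def check_metric(name, value, samples):
    """Return an alarm string if value deviates from the baseline, else None."""
    lo, hi = normal_range(samples)
    if not (lo <= value <= hi):
        return f"ALARM: {name}={value} outside normal range ({lo:.1f}, {hi:.1f})"
    return None

# A week of readings for this metric and time period (illustrative data).
baseline = [42, 45, 40, 44, 43, 41, 46, 44]
print(check_metric("cpu_percent", 43, baseline))  # within the learned range
print(check_metric("cpu_percent", 90, baseline))  # deviation triggers an alarm
```

A production tool would maintain separate baselines per metric and per time window (weekday mornings versus weekend nights, for example), which is exactly the "different time periods" distinction the bullet above refers to.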

Forrester Survey Highlights Trend Towards Hybrid Memory Architecture
Digital Transformation initiatives are well underway at global enterprises in industries such as Financial Services, Telco, Retail, and Manufacturing. Companies are deploying real-time Systems of Engagement (SoE) applications to better serve their customers, employees and partners and to increase their competitive advantage. Examples of SoE applications include stock trading and fraud and risk assessment in Financial Services, optimizing network resources and improving the customer experience in Telco, hyper-personalization and ad targeting in Retail, and IoT device monitoring and management in Manufacturing.

The ability to perform analytics in real-time on transactional data is critical. SoE applications require a database architecture that can analyze billions of records across millions of transactions per second and return decisions within milliseconds. But traditional database platforms are not capable of handling these new requirements. An RDBMS or first-generation NoSQL database with an external caching layer often cannot meet SLA requirements for speed, uptime and consistency. In-memory databases, which store data in DRAM, are a better solution but suffer from performance and reliability issues.

Hybrid memory databases move beyond in-memory to meet the requirements of real-time SoE applications. According to Forrester Research, “Hybrid memory architecture is a new approach that leverages both volatile memory (DRAM) and non-volatile memory such as SSD and flash to deliver consistent, trusted, reliable and low latency access to support existing and new generations of transactional, operational, and analytical applications. Implementing this type of architecture allows organizations to move from a two-tier in-memory architecture to a single tier structure that simplifies the movement and storage of data without requiring a caching layer. Early-adopters are seeing several benefits to their business like a lower cost of ownership, huge reductions in their server footprint, simplified administration and improved scalability.”

Aerospike recently commissioned Forrester Consulting to conduct a study surveying North American organizations to determine the challenges they face with their current in-memory architecture and to evaluate the awareness and implementation of hybrid memory architecture. Here are the key findings and recommendations from the survey study titled Hybrid Memory Architecture Drives Real-Time Systems of Engagement: Replace Traditional In-Memory Architectures To Better Win, Serve, And Retain Customers.

KEY FINDINGS
• Challenges with current in-memory architectures include high cost of ownership, issues with reliability and uptime, and an increasing number of databases.
• Many companies are planning to expand their memory-based database architecture investments to support growing business needs. Companies are planning to increase their spending on memory architectures as part of their digital transformation initiatives over the next 1-2 years. Sixty-five percent of decision-makers plan to increase spending 1-15%, and 32% plan to spend greater than 16% over the next year or two.
• Hybrid memory architectures deliver better consistency, reliability and low-latency access to active data.

RECOMMENDATIONS
• Focus on hybrid memory architecture that delivers better consistency, reliability, performance and scale. Hybrid memory architecture is ideal for applications that need extreme low-latency access to critical data. Use these solutions to scale out horizontally to leverage low-cost servers yet deliver the performance and scale needed.
• Look for hybrid memory solutions that meet your requirements for uptime, data consistency, reliability and performance, and that have a lower TCO.
• Use hybrid memory solutions to support multiple workloads, such as operational and analytical.

ABOUT AEROSPIKE
Aerospike is the only database that reliably handles the demands of SoE applications. Aerospike’s Hybrid Memory Architecture combines solid-state drives (SSD) and DRAM to achieve the sustained performance that SoEs require—with a significantly smaller footprint. Aerospike’s Smart Client™ technology automatically handles complex database processes so developers and operations staff can focus on the business, not administration. The Aerospike database is successfully meeting the challenges of the most demanding digital economy players in Financial Services, Telecommunications, Retail, Manufacturing, Ad Tech, eCommerce, Gaming, Oil and Gas, Media and Publishing.

For more information about AEROSPIKE and to download the survey study, please visit www.aerospike.com
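The hybrid memory pattern, keeping the index in DRAM while record payloads live on SSD, can be illustrated with a toy key-value store in Python. This is a sketch of the general idea only, not Aerospike’s implementation; the class, file layout and keys are all invented for the example.

```python
import os
import tempfile

class HybridKV:
    """Toy hybrid-memory store: the index (key -> offset, length)
    stays in RAM, while record payloads live in an append-only file
    standing in for the SSD tier."""

    def __init__(self, path):
        self.index = {}              # in-memory primary index
        self.f = open(path, "a+b")   # append-only data file ("SSD")

    def put(self, key, value: bytes):
        self.f.seek(0, os.SEEK_END)
        offset = self.f.tell()
        self.f.write(value)
        self.f.flush()
        self.index[key] = (offset, len(value))  # only metadata kept in RAM

    def get(self, key):
        offset, length = self.index[key]
        self.f.seek(offset)          # a single device read per lookup
        return self.f.read(length)

path = os.path.join(tempfile.mkdtemp(), "data.bin")
kv = HybridKV(path)
kv.put("user:1", b'{"name": "ada"}')
kv.put("user:2", b'{"name": "alan"}')
print(kv.get("user:1"))  # payload fetched from "SSD" via the RAM index
```

The point of the design is the footprint argument made above: RAM holds only small index entries per record, so a much larger dataset fits behind a given amount of DRAM than a pure in-memory store could hold, at the cost of one flash read per lookup.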

Deliver Better
Oracle Application
Performance
Every time you roll out new code, you’re rolling the dice. Good code? Bad code? Will it work properly? If you’re like many application owners, you won’t really know how your code will perform until it’s fully deployed in a production environment. Sure, your QA team tested the code. But did they do it in a full-scale production environment?

BAD CODE: SLOW APPS
It turns out that sub-optimal application code causes some 70-90% of application performance issues, according to experts. This spawns cascading problems in production systems, including slow response times for users, declining satisfaction, and lower usage rates. And, while your dev, database, and infrastructure teams are fixing the problems, they’re not providing new, expanded, or enhanced application capabilities to grow and improve your company’s business. Thus, SLAs, agility, and responsiveness suffer.
Expert Oracle DBAs can attest that many application problems would be solved by testing and QAing new application code using actual production workload and hardware. Testing code with a production workload identifies big problems so that programming can be fixed before deployment. The problem is that it takes time to procure, set up, and configure databases and storage hardware, and then load all of that production data into test databases.
But, what if your team could test with production data at scale—without breaking the bank? You’d deliver better applications with fewer performance issues. And you’d approach each moment of truth with greater confidence in the result.

YOU CAN DO IT—AND THE ANSWER STARTS WITH FLASH STORAGE. YES, FLASH STORAGE.
Application owners can now deploy nearly cost-free real-world Oracle database test and QA environments for their developers by deploying applications on the Pure Storage FlashArray. This prevents problems from entering production environments and unleashes sustainable business agility.
Flash storage—solid state drives similar to those in smartphones—delivers blazing fast response times that improve database performance with, for example, a 10x reduction in database backup/recovery time and a 3x improvement in the performance of transactional and business intelligence applications.
Over the last several years, server and networking equipment have advanced at a rapid pace, but traditional spinning disk storage has not kept up. Your application may have all of the server capacity and network bandwidth it needs, but still lack adequate storage horsepower, creating a major bottleneck.
When you replace spinning disks with flash arrays with advanced capabilities, you can actually create copies of the production database in a fraction of the time. Advanced flash arrays not only speed up applications with faster and more resilient storage, they also give companies the ability to create copies of the production database—without time-consuming backup/restores and without slowing down production systems.
For example, advanced de-dupe and compression capabilities from Pure Storage further squeeze compressed data by 2:1 and uncompressed databases by a factor of 4-5:1. Even greater reductions (20:1) are produced in virtualized desktop environments, allowing even more data on less storage hardware. Incremental backups are quick and complete, taking less storage capacity to create and store.

A WINNING COMBINATION
The combination of advanced software deduplication and compression in the flash storage array frees up significant storage capacity. This extra capacity allows for development and QA environments capable of production workloads—without the hardware overhead that was previously a requirement.
In this scenario, teams eliminate the need to procure additional hardware budget for testing environments. They also dramatically reduce the staff time required to clone and load production data workloads; this now takes only minutes instead of days with advanced arrays.
Your dev and QA teams can test application code in accurate, simulated production environments. They’ll be able to proactively identify and fix load-related problems—before code goes into production. With accurate load-testing, you’ll deliver applications that are more stable, scalable, and robust.
Now, the moment of truth has come and gone. Your new application is up and running. Users are, well, using the application. Business executives are pleased. Your infrastructure team is relieved. And you and your team are proud.

YOU DID IT!
Now you’re ready to work on the next application so you can move your business forward. Today has been a great day.

PURE STORAGE
www.purestorage.com
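Data-reduction ratios such as the 2:1 to 20:1 figures above come from storing each unique block only once and compressing what remains. A simplified content-addressed sketch in Python makes the mechanism concrete; it is illustrative only, since real arrays use variable block sizes and far more sophisticated compression than `zlib`.

```python
import hashlib
import zlib

def store(blocks):
    """Deduplicate fixed-size blocks by content hash, then compress
    the unique ones; returns (logical_bytes, physical_bytes)."""
    unique = {}
    logical = 0
    for block in blocks:
        logical += len(block)
        digest = hashlib.sha256(block).hexdigest()
        if digest not in unique:            # each distinct block stored once
            unique[digest] = zlib.compress(block)
    physical = sum(len(c) for c in unique.values())
    return logical, physical

# Ten copies of the same 4 KB block, as in repeatedly cloned test databases.
blocks = [b"A" * 4096] * 10
logical, physical = store(blocks)
print(f"logical {logical} bytes stored in {physical} physical bytes")
```

Cloned test and QA databases are an extreme case of this effect: most blocks are identical to the production copy, so each additional clone costs almost no physical capacity, which is why the article can describe such environments as "nearly cost-free."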

Intelligent Query Routing for Scalable, High-Performance Data Access
In the digital era, where rising user experience expectations have to be met while at the same time protecting personal data, database proxies have become a critical component of a modern data infrastructure.

A database proxy, deployed between applications and databases, provides two critical capabilities: query routing and query/response filtering. Because the database proxy is responsible for routing queries, the underlying database infrastructure can be changed without modifying applications—enabling database administrators to add/remove database servers, perform database upgrades and failover/switchover databases on demand.

In addition, the database proxy is in a position to intercept both queries and query results, enabling the database proxy to prevent queries from reaching the database, or to modify the results before they are returned to the application. For example, when acting as a firewall, the database proxy can block queries by malicious attackers attempting to corrupt or delete data. In the case of personal data (e.g., a Social Security number or credit card number), the database proxy can use pseudonymization to modify the results before they are returned to the application.

While query routing and query/result filtering provide a number of benefits when it comes to managing databases and protecting personal data, performance and scalability are required to meet rising user experience expectations. With intelligent routing, a database proxy can not only use multiple database servers to scale reads and/or writes, it can route reads and writes to different database servers. This enables the database proxy to route writes to one database server and load balance reads, with or without weighting, to other database servers. With the use of connection pooling, persistent connections and caching, a database proxy can further improve performance, reducing the cost associated with creating database connections and reducing the database workload by returning cached results without having to query the database.

MariaDB MaxScale, an advanced database proxy, is built on a multi-threaded architecture using asynchronous I/O and epoll to scale on modern, multi-core processors. By using threads to manage client and database connections, MariaDB MaxScale can support thousands of connections, and as the number of processor cores increases, it can support more applications with higher query throughput and lower query latency.

MariaDB MaxScale uses automatic topology detection and dynamic configuration to support intelligent routing. In a master/slave database deployment, it can identify which database is the master (where to route writes) and which databases are the slaves (where to route reads). In a multi-master deployment with clustering, it can automatically designate one of the database servers to be the master (to avoid conflicts) while routing reads to the other database servers. In addition, it can detect when a database server has failed and optionally execute a script to fail over to a different database server. Further, MariaDB MaxScale supports schema-based sharding, routing queries to different database servers based on the schema being queried.

MariaDB MaxScale is engineered for both performance and extensibility, using plugins and protocols to provide extended capabilities. The firewall plugin can be enabled to prevent queries from being executed based on rules and syntax—type, table, column, time and frequency—while the data masking plugin can be enabled to replace the values of specific columns with generic values. For example, replacing the value of a Social Security number column with XXX-XX-XXXX. The result limiting and firewall plugins can also be used to prevent denial-of-service and distributed denial-of-service attacks, limiting the number of connections, the query rate and the result size to prevent the database and/or network from being overwhelmed.

In addition, the change-data-capture plugin can stream database events as Avro objects or JSON documents to external systems (e.g., Apache Kafka) by tapping into the replication process in MariaDB Server.

In a recent benchmark using sysbench, MariaDB MaxScale proxied over 300,000 queries per second for 1,024 client connections on a 16-core processor—nearly as fast as direct database access. With query caching enabled, it processed 900,000 queries per second—about three times faster than querying the database.

MariaDB MaxScale is the world's most advanced database proxy, capable of providing the performance and scale required to meet user expectations while at the same time providing the security capabilities necessary to protect personal data and prevent data breaches—and the revenue/reputation costs associated with them. MariaDB MaxScale is part of MariaDB TX, an enterprise database solution built on MariaDB Server, and is compatible with Oracle MySQL.

Learn more about MARIADB MAXSCALE
https://mariadb.com/products/technology/maxscale
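For illustration, the read/write splitting described in this article might be expressed as a minimal MaxScale-style configuration. The server address, section names, and credentials below are placeholders, and the exact directives should be checked against the MaxScale documentation for your release:

```ini
# Hypothetical backend server (address and credentials are placeholders).
[server1]
type=server
address=10.0.0.11
port=3306
protocol=MariaDBBackend

# Route writes to the master, load balance reads across slaves.
[Splitter-Service]
type=service
router=readwritesplit
servers=server1
user=maxscale_user
password=maxscale_pwd

# Applications connect here instead of to the database directly.
[Splitter-Listener]
type=listener
service=Splitter-Service
protocol=MariaDBClient
port=4006
```

A monitor section would normally accompany this to provide the automatic topology detection described above; its module name varies by MaxScale version.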
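The data-masking behavior described above can be sketched in plain Python. `mask_ssn` and the row format are illustrative stand-ins, not MaxScale's actual plugin API:

```python
import re


def mask_ssn(rows, column):
    """Replace SSN-like values in `column` with a generic placeholder,
    as a proxy-side masking filter would before results reach the app."""
    ssn = re.compile(r"^\d{3}-\d{2}-\d{4}$")
    masked = []
    for row in rows:
        row = dict(row)  # copy so the original result set is untouched
        if ssn.match(str(row.get(column, ""))):
            row[column] = "XXX-XX-XXXX"
        masked.append(row)
    return masked


rows = [{"name": "Ada", "ssn": "123-45-6789"}]
print(mask_ssn(rows, "ssn"))  # [{'name': 'Ada', 'ssn': 'XXX-XX-XXXX'}]
```

Because the proxy rewrites the result set, the application receives only the generic values and never sees the real personal data.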

The Evolving Open Source Database Landscape
Databases are moving toward open source deployments that require less maintenance to power the applications and websites that enterprises use every day. Businesses need to be mobile and flexible when it comes to their application and web use—and this mobility must be mirrored in their database environments.

The following are important developing areas in the database landscape:

THE "CLOUD"
Cloud-based solutions are growing and encompassing more and more of the database landscape. The cloud reduces the management and maintenance overhead for companies looking to control costs. But this comes with concerns such as over- and under-provisioning, getting proper support, and ballooning costs. It is still important to understand what is going on with your database architecture and environment.

CONTAINERIZATION
Containers are lightweight alternatives to full machine virtualization. This provides many of the same benefits as virtual machines, and the application can be run on any physical machine without worrying about dependencies.

While Docker was the first big player, it isn't the only container option: CoreOS's Rocket, LXC, Project Atomic, and others exist. Ubuntu has announced the LXD container engine for its version of Linux, and Windows Server has Drawbridge and Spoon. Kubernetes is an open source container cluster manager that can automate deployment, scaling, and operations of application containers across clusters of hosts.

SECURITY AND ENCRYPTION
With more and more systems, applications, and processes going online for remote access, there is more data that is exposed to breaches. Many industries—healthcare, financial services, government, and insurance—have mandated compliance regulations. Many security officers equate compliance with security. This isn't true, and compliance doesn't prevent data theft. The rules for (and problems with) good security haven't changed. Companies must continue to ensure seamless encryption (both in transit and at rest) with as little overhead as possible.

EFFICIENCY
Two of the ways to achieve database efficiency are speed and compression. As per-gigabyte prices drop compared to spinning drives, SSDs are becoming the go-to hardware of choice for speed.

For compression, the interplay between algorithm and block size continues to improve—as we've seen this year with Snappy, Zstandard, and LZMA.

AUTOMATION
Automation features ensure you can scale performance in the event of high user or query concurrency, optimize performance for dashboarding and reporting, implement data analytics, distribute data and manage metadata, and implement high availability and disaster recovery.

Many fully synchronous replication solutions are out there already (Percona XtraDB Cluster and MySQL Group Replication), but look for more in the semi-synchronous replication area via tools such as Orchestrator, MHA, and others.

SCHEMA CHANGE AUTOMATION
Applications that rely on unknown or undocumented schema changes offer a recipe for disaster. They lead to rushed fixes for untested or unexpected workloads. Instead, you can use environmental intelligence to understand how changes will impact applications in production or any other environment.

Many companies are creating methods of automatically altering the schema via scripts. These approaches include oak-online-alter, pt-online-schema-change, and now gh-ost.

POLYGLOT ARCHITECTURES
"Polyglot persistence" simply means using multiple data storage technologies working together. The cost, of course, is environmental complexity. But the benefits of flexibility, mobility, and adaptability can be worth it. And having the ability to quickly scale both up and out by employing both NoSQL and relational databases can be advantageous.

Polyglot persistence will become much more common. Organizations can no longer be only a MySQL or MongoDB shop.

ACCESS CONVERGENCE
The use of multiple technologies in a database environment increases database access convergence: the ability of databases to serve and use data stored in a foreign format (for example, moving relational data into a NoSQL environment, and vice versa).

The MySQL world is seeing an increase in NoSQL access patterns via mysqlsh (the MySQL shell) and formats such as GeoJSON. The new MySQL document store feature also looks promising. PostgreSQL uses JSON functions and foreign data wrappers. The Hadoop world is, of course, focused on getting more SQL on top, via Spark.

MONITORING
Database customers will look for monitoring tools that can oversee and provide insight into multiple technologies. With polyglot persistence, the movement to the cloud, access convergence, and the overall increase in database options that can be used in a single environment, a useful monitoring tool looks at the status and performance of relational, NoSQL, and other databases simply and easily.

The open source database landscape is trending toward more flexibility, with varied technologies all working together to achieve specific goals. With this agility comes a need to easily manage, monitor, and troubleshoot the database environment.
