
Tech Alert: Using Software-Defined Storage in Hyper-Scale Environments

FalconStor Experts Offer Tips on Architecting Modern Data Centers for Hyper-Scale Requirements

MELVILLE, NY--(Marketwired - February 03, 2016) -  Storage workloads in modern data centers increasingly require scale-out environments to run demanding enterprise applications. These hyper-scale architectures can benefit greatly from software-defined storage (SDS) in terms of economic value, flexibility, and operational efficiency, according to experts at FalconStor Software® Inc. (NASDAQ: FALC), a 15-year innovator of software-defined storage solutions.

Scale-out workloads such as NoSQL databases, online transaction processing (OLTP), cloud, and big data analytics are hungry for the performance and capacity needed to deliver appropriate service levels to end users and applications. The architecture of a hyper-scale data center that must grow to meet compute, memory, and storage requirements on demand often depends on the nature of the applications and on business priorities such as flexible capacity, security, and uptime. Projects typically are driven by the overall cost of ownership, particularly as requirements reach hundreds of petabytes.

"Many modern applications that need these hyper-scale scale-out environments offer built-in resiliency, protect themselves from hardware failures, and can self-heal, which eliminates the need to build in high-availability at the storage layer," said Farid Yavari, Vice President of Technology at FalconStor. "That opens the door to using consumer-grade, commodity hardware that can fail without impact on service availability. On the other hand, the relatively smaller footprint of revenue-generating scale-up applications may justify paying a premium for name brand storage with HA and data protection features, because it's unwise to test radical new technologies in that environment."

Properly architected SDS platforms enable the use of heterogeneous commodity hardware to drive the lowest possible cost, orchestrate data services such as replication, and create policy-driven, heat-map-based tiers so that data is placed on the appropriate storage media. An SDS approach eliminates the reliance on expensive, proprietary hardware and vendor "lock-in."
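The sketch below illustrates, in simplified form, what a policy-driven, heat-map-based placement decision might look like. It is a hypothetical example, not FalconStor code; the tier names, thresholds, and data structures are invented for illustration.

```python
# Illustrative sketch only: a toy policy engine that assigns data blocks to
# storage tiers based on an access-frequency "heat map". Tier names and
# thresholds are hypothetical and do not describe any vendor's product.
from dataclasses import dataclass

@dataclass
class Block:
    block_id: str
    accesses_per_hour: float  # observed access frequency ("heat")

# Hypothetical tiers ordered from fastest/most expensive to slowest/cheapest.
TIER_THRESHOLDS = [
    ("nvme_flash", 100.0),    # hot data: frequent reads/writes
    ("sata_ssd", 10.0),       # warm data
    ("hdd_capacity", 0.0),    # cold data: everything else
]

def place_block(block: Block) -> str:
    """Return the tier a block should live on, given its heat."""
    for tier, min_heat in TIER_THRESHOLDS:
        if block.accesses_per_hour >= min_heat:
            return tier
    return TIER_THRESHOLDS[-1][0]  # fallback: coldest tier

if __name__ == "__main__":
    for b in [Block("db-index", 250.0), Block("monthly-report", 12.0), Block("archive-2014", 0.1)]:
        print(b.block_id, "->", place_block(b))
```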

The two most common models for scale-out, hyper-scale storage are direct-attached storage (DAS) and disaggregated storage based on protocols such as Internet Small Computer Systems Interface (iSCSI) or Non-Volatile Memory Express (NVMe). Some very large custom data center installations at companies with the right protocol-level engineering staff run on homemade, workload-specific protocols developed to optimize storage traffic for their particular use cases. Because the DAS model is constrained by the slots available in a server, its scale is limited and often quickly outgrown, and compute and storage resources cannot be scaled independently. Enterprises therefore start with, or must ultimately move to, disaggregated storage models built with commodity hardware.

SDS adds intelligent orchestration and management to the disaggregated data center via an abstraction layer that separates heterogeneous storage hardware from applications, resulting in a more resilient, efficient, and cost-effective infrastructure. Because SDS is hardware agnostic, enterprises can introduce new storage technologies into a brownfield implementation, eliminating the need to deploy greenfield infrastructure when first migrating to newer storage models. Using SDS capabilities, the migration from legacy to modern technologies can happen over time, maximizing Return on Investment (ROI) in an already established storage infrastructure. SDS provides flexibility in data migration, seamless tech-refresh cycles, and independent scaling of storage and server resources. Even where data protection and high availability (HA) capabilities aren't necessary, SDS can provide other valuable features such as actionable predictive analytics, Wide Area Network (WAN) optimization, application-aware snapshots, clones, Quality of Service (QoS), deduplication, and data compression.
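To clarify what the abstraction layer does, the following sketch models a virtual volume that hides whether data lives on local DAS or on a disaggregated network target. The backend classes, method names, and in-memory storage are assumptions made for illustration; a real SDS platform implements this separation at the block or protocol layer rather than in application code.

```python
# Illustrative sketch only: a minimal abstraction layer separating applications
# from heterogeneous storage backends. Class and method names are hypothetical.
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Interface the application layer programs against."""
    @abstractmethod
    def write(self, volume: str, offset: int, data: bytes) -> None: ...
    @abstractmethod
    def read(self, volume: str, offset: int, length: int) -> bytes: ...

class LocalDASBackend(StorageBackend):
    """Stand-in for direct-attached disks, kept in memory for the example."""
    def __init__(self):
        self._volumes = {}
    def write(self, volume, offset, data):
        buf = self._volumes.setdefault(volume, bytearray())
        if len(buf) < offset + len(data):
            buf.extend(b"\x00" * (offset + len(data) - len(buf)))
        buf[offset:offset + len(data)] = data
    def read(self, volume, offset, length):
        return bytes(self._volumes.get(volume, bytearray())[offset:offset + length])

class DisaggregatedBackend(StorageBackend):
    """Placeholder for a network target (e.g., iSCSI or NVMe over fabrics)."""
    def __init__(self, address: str):
        self.address = address
    def write(self, volume, offset, data):
        raise NotImplementedError("network transport omitted in this sketch")
    def read(self, volume, offset, length):
        raise NotImplementedError("network transport omitted in this sketch")

class VirtualVolume:
    """Applications see one volume; the backend behind it can be swapped or migrated."""
    def __init__(self, backend: StorageBackend, name: str):
        self._backend, self._name = backend, name
    def write(self, offset: int, data: bytes) -> None:
        self._backend.write(self._name, offset, data)
    def read(self, offset: int, length: int) -> bytes:
        return self._backend.read(self._name, offset, length)

vol = VirtualVolume(LocalDASBackend(), "app-data")
vol.write(0, b"hello hyper-scale")
print(vol.read(0, 17))  # b'hello hyper-scale'
```

Because the application only ever talks to the virtual volume, the backend can be replaced or data migrated behind it without changing application code, which is the property that enables gradual brownfield tech refreshes.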

"Software-defined storage solutions blend well with hyper-scale infrastructures built to meet growing requirements for storage flexibility, density and performance," said Yavari. "Falling prices of flash, the introduction of various flavors of storage-class memory, and an increasing appetite for commoditization of the data center infrastructure has helped fuel possibilities in hyper-scale storage. SDS enables deployment of storage technologies with different capabilities, at various cost points, to drive the lowest possible Total Cost of Ownership (TCO)."

FalconStor's FreeStor® delivers enterprise-class, software-defined, intelligent data services combined with predictive analytics across any primary or secondary storage hardware, in the cloud or on-premises. FreeStor helps IT organizations realize more economic value from existing environments and any future storage investments while maximizing flexibility and operational efficiency.

About FalconStor
FalconStor® Software, Inc. (NASDAQ: FALC) is a leading software-defined storage company offering a converged data services software platform that is hardware agnostic. Our open, integrated flagship solution, FreeStor®, reduces vendor lock-in and gives enterprises the freedom to choose the applications and hardware components that make the best sense for their business. We empower organizations to modernize their data center with the right performance, in the right location, all while protecting existing investments. FalconStor's mission is to maximize data availability and system uptime to ensure nonstop business productivity while simplifying data management to reduce operational costs. Our award-winning solutions are available and supported worldwide by OEMs as well as leading service providers, system integrators, resellers and FalconStor. The company is headquartered in Melville, N.Y. with offices throughout Europe and the Asia Pacific region. For more information, visit www.falconstor.com or call 1-866-NOW-FALC (866-669-3252).

Follow us on Twitter - Watch us on YouTube - Connect with us on LinkedIn

FalconStor, FalconStor Software, FreeStor, and Intelligent Abstraction are trademarks or registered trademarks of FalconStor Software, Inc., in the U.S. and other countries. All other company and product names contained herein may be trademarks of their respective holders.

Media Contact:
Scott Kline
JPR Communications

818-798-1474
[email protected]


