Drowning in Data and Starving for Information

The ability to generate and capture sensor data, communications, and imagery has grown exponentially, and there is no sign of that changing. To extract maximum benefit from the deluge of data generated today, analysts need to combine the new generation of analytical tools with the human expertise that is rapidly developing around them. Properly applied, these advances give today’s analysts an unprecedented ability to make real-time, data-driven decisions, along with improved options to collaborate and share information. However, effective implementation in the era of hyperscale data depends on the ability to manage conventional system technology and unpredictable volume requirements. An imbalanced analytics system can create bottlenecks that reduce the value of both the analytics tools and the enterprise data warehouses (EDWs) that power them. It is therefore critical that organizations build scalable storage and storage-processing technology at both ends of the ingest/egest process to complement today’s high-speed data warehousing and analytics technology. Using actual benchmark tests, this session will detail how new file system options and cloud infrastructure can improve the performance of analytics applications, and will demonstrate new options for collaboration and information sharing.
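The bottleneck argument above can be made concrete with a small sketch (all rates are hypothetical illustrations, not DDN benchmark results): in a staged analytics pipeline, sustained throughput is capped by the slowest stage, so a fast EDW fed by slow ingest storage delivers the storage's rate, not the warehouse's.

```python
# Illustrative sketch: end-to-end throughput of a staged analytics
# pipeline is limited by its slowest stage (the bottleneck).
# Stage rates are hypothetical, in GB/s.

def pipeline_throughput(stage_rates):
    """Sustained throughput of a serial pipeline is the minimum stage rate."""
    return min(stage_rates.values())

def bottleneck(stage_rates):
    """Name of the stage that caps the whole pipeline."""
    return min(stage_rates, key=stage_rates.get)

stages = {
    "ingest_storage": 2.0,   # storage feeding the data warehouse
    "edw_processing": 10.0,  # high-speed data warehouse
    "egest_storage": 3.0,    # results written back out
}

print(pipeline_throughput(stages))  # 2.0
print(bottleneck(stages))           # ingest_storage
```

Balancing the system means raising the ingest/egest storage rates toward the warehouse's rate, which is the motivation for scalable storage at both ends of the process.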

For many companies, one of the biggest decisions they will make, or have made, when moving to the cloud is how to implement their cloud storage infrastructure. Issues around ease of access, uptime, security, redundancy, SLAs, maintenance, and total cost of ownership drive the decision toward one of three options: build, buy, or rent. This webinar will explore these issues and compare them across the three options, helping you optimize your plan for your business needs and growth.
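One way to frame the total-cost-of-ownership part of that comparison is up-front cost plus recurring cost over a planning horizon. The sketch below illustrates the shape of the calculation only; every dollar figure is a hypothetical placeholder, not data from the webinar.

```python
# Toy TCO comparison for build / buy / rent cloud storage options.
# All dollar figures are hypothetical placeholders for illustration.

def tco(upfront, annual, years):
    """Total cost of ownership over a planning horizon."""
    return upfront + annual * years

options = {
    "build": tco(upfront=500_000, annual=70_000, years=5),   # DIY infrastructure
    "buy":   tco(upfront=300_000, annual=120_000, years=5),  # appliance purchase
    "rent":  tco(upfront=0, annual=200_000, years=5),        # public cloud service
}

cheapest = min(options, key=options.get)
print(cheapest)  # build
```

In practice the non-cost factors listed above (uptime, security, SLAs) often dominate the decision, so a raw TCO number is only one input.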

Please join Forrester Research and DataDirect Networks for this educational webinar to learn about the challenges and solution options for implementing storage in the cloud.

Higher education institutions around the world are in the process of digitizing, indexing and adding searchable metadata to their digital collections. As these data sets expand to hundreds of millions of immutable objects and petabytes of data, there is a need to consolidate and automate to maximize the value of this capital expenditure.

The Integrated Rule-Oriented Data System (iRODS) is a popular open source policy-based data management system (application) that enables data sharing, publication and preservation and is also capable of implementing an automated data analysis pipeline. For each type of data management application, policies or rules can be applied that control the execution of procedures that enforce desired collection properties. By evolving the policies, a collection can migrate through the stages of the data life cycle.
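Actual iRODS policies are written in the iRODS rule language; as a language-neutral sketch of the concept described above (all names here are hypothetical, not the iRODS API), a rule can be modeled as an action that fires when an object reaches a given life-cycle stage, so that evolving the rule set migrates the collection through the data life cycle.

```python
# Conceptual sketch of policy-based data management (NOT iRODS rule
# syntax): each rule enforces a desired collection property when an
# object is at a particular life-cycle stage.

from dataclasses import dataclass

@dataclass
class DataObject:
    name: str
    stage: str = "ingested"
    replicas: int = 1
    checksummed: bool = False

def rule_checksum(obj):
    """On ingest, record a checksum to enforce integrity."""
    if obj.stage == "ingested":
        obj.checksummed = True

def rule_replicate(obj):
    """On publication, keep at least two replicas for preservation."""
    if obj.stage == "published":
        obj.replicas = max(obj.replicas, 2)

def apply_policies(obj, rules):
    for rule in rules:
        rule(obj)
    return obj

obj = DataObject("scan_0001.tiff")
apply_policies(obj, [rule_checksum, rule_replicate])
print(obj.checksummed)  # True (ingest-stage rule fired)
obj.stage = "published"
apply_policies(obj, [rule_checksum, rule_replicate])
print(obj.replicas)     # 2 (publication-stage rule fired)
```

The same pattern extends to automated analysis pipelines: a rule's action can launch a processing procedure rather than set metadata.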

DataDirect Networks’ Storage Fusion Architecture is a highly scalable storage platform capable of hosting iRODS in a tightly coupled solution with a RAID controller and parallel file system, providing an appliance-like solution that can scale to multiple petabytes in a single namespace.

For large storage systems, the massively parallel open source file system Lustre provides an incredible level of performance, flexibility and scalability. It is a tested and proven technology that is currently used by over 60% of the TOP100 supercomputing sites in the world, and more high-end systems are coming online all the time.

However, in practice, Lustre’s known for being a little “hard to use.”

Interested in the advantages of this powerful filesystem but concerned about day-to-day administration issues? Looking to evaluate the technology but need answers first? This one-hour webinar will cover a wide range of issues, but is intended to be open to your questions. What items should you monitor in a Lustre system daily? How do you configure user quotas? How do you automate failover with heartbeat? How do you restore Lustre from file-based backups? What are some good tips for tuning Lustre specifically to your environment?

Whamcloud provides support and training services and lives and breathes Lustre. Whamcloud partners with DataDirect Networks to provide solutions that fix the headaches associated with Lustre administration. Come and find out core tips and tricks in building a rock solid Lustre environment that will work specifically for you and your storage environment.

This webinar will be led by Zhiqi Tao, Senior System Engineer at Whamcloud, now owned by Intel, and Dr. James Coomer, DDN’s Senior Technical Advisor for EMEA. Thirty minutes will be spent offering key tips and tricks for Lustre administrators. The session will then be opened to the audience for 30 minutes of questions.

Join DDN experts to see how organizations are leveraging developments in storage infrastructure to extract the greatest possible value from their data. Material covered will include general architectural concepts on building storage infrastructure for big data analytics, as well as a detailed discussion of real world applications and benchmarking results with SAS and Vertica platforms. Specifics on the impact to data ingest speed, query performance, flexibility, ease of management and overall scalability will also be covered.

This webinar will explore methods for minimizing latency and accelerating Big Data processing, analytics, and distribution using high performance, InfiniBand-connected storage systems, specifically in web scale data centers. We will review the network and system factors influencing performance today, and present storage and interconnect solutions that minimize process latencies, decrease time to results and increase ROI for Big Data applications.

Please join Mellanox and DDN for this educational webinar to learn about the concepts, challenges and performance options for Big Data analytics applications.

Join DDN, FilmPartners, and Broadcast Engineering for an in-depth discussion on the Unified Workflow solution created by tightly integrating DDN's xSTREAMscaler and MXFserver 5.0. With experts from FilmPartners and DDN, moderated by Michael Grotticelli of Broadcast Engineering you will not want to miss this webinar as we go over all the details, take questions from the audience and show examples of real-world success.

Learn in this webinar how DDN has raised the performance bar (again) in announcing the SFA12K™ series of scalable storage platforms. The SFA12K product family is a series of systems designed from the ground up for the next wave of data creation, distribution and processing. With this offering, DDN has established clear leadership along a number of key dimensions:
Big Data Innovation: We're now delivering the features that our competitors only dream of having, including our In-Storage Processing™ and Storage Fusion Fabric™. These solutions not only greatly simplify big data storage environments, but also enable applications to excel.

Performance: The SFA12K is 2.7 times faster than its next-closest competitor... and, by the way, that's our own SFA10K-X. The system is up to 800% faster than competing systems.

Data Center Efficiency: The SFA12K can reduce data center requirements and is an astonishing 240% more efficient than competing scale-out technology.

Engineered Scalability: With a focus on managing large-scale systems, technologies such as DDN's SATAssure™, ReACT™ and new Quality of Service features serve as validation of the ~1M lines of code we've built into our platform, making large data sets easier to manage with the SFA.

From the Cloud Computing Conference at NAB 2012, Jean-Luc Chatelain describes how Creators can leverage cloud computing for improving and accelerating content creation. Listen and learn why "Cloud is the New Black".

This webinar will cover highlights of the Lustre roadmap and areas of interest on the business side of the equation. What can managers and technology buyers expect from Lustre? Should you upgrade to Lustre 2.2? What news will come out of the Lustre Users Group (April 23-25)? What are the opportunities for using Lustre in commercial HPC markets moving forward? Bring your toughest Lustre questions - a Q&A session with leading Lustre experts will follow the presentations.

Learn how NOAA accelerates its research by using storage from DataDirect Networks. NOAA's research helps keep citizens informed of the changing environment around them, from the surface of the sun to the depths of the ocean floor.

“Legal Concerns & Practical Solutions for Media in the Cloud” is a webinar produced by DataDirect Networks in association with the DCIA and Dow Lohnes PLLC Attorneys At Law. This webinar will discuss the latest contractual, copyright, and Fourth Amendment consequences of using cloud-based services. It will also cover possible private media cloud solutions that may help expedite the adoption of cloud storage and collaboration by Media and Entertainment companies.

Storage is a major factor in time to results for analytics workflows. In this webinar, Larry Jones from DataDirect Networks (DDN) will highlight STAC testing results that demonstrate the advantage of high performance storage.

Supercomputing is the biggest event of the year in High Performance Computing, and DataDirect Networks is one of the biggest forces present at SC'11. Hear about the latest products and technology directions from the leader in HPC storage, including the announcement of the SFA 12K product line.

Are you a "cloud builder?" This webinar is specifically for you - learn how scalability and performance challenges can be eliminated upfront by avoiding the pitfalls of using traditional storage for the unique needs of cloud implementation. DDN, the technology leader in cloud storage, will highlight how object storage is the optimal platform for building private and public clouds, as well as setting a path to Exascale computing by moving to a "NoFS" approach. Join DDN to discuss new approaches to cloud storage scalability, and become part of the dialogue around how to remove storage as the bottleneck in your cloud building efforts.

Big Data is causing IT departments to revisit their assumptions about storage, and in many cases to look at new types of storage architectures. This channel will highlight how DDN's customers have achieved success in handling their most difficult Big Data problems.