Saturday 29 February 2020

Dell Technologies Joins Sheltered Harbor Alliance Partner Program as the First Solution Provider

If you’re in the data business, and even if you’re not, you know that high-profile cyberattacks in the recent past have resulted in the public leak of huge amounts of stolen data, including employee and customer personal information, corporate intellectual property, and even unreleased films and scripts.

In 2015, in response to the increased frequency and sophistication of cyberattacks on the financial sector, a not-for-profit, industry-led initiative was launched. Its aim was to protect the triad of financial institutions, their customers and general public confidence in the U.S. financial system against a catastrophic event.

With the collaboration of hundreds of subject matter experts, and the backing of major players across the industry, the Sheltered Harbor initiative implemented a standard to protect customer data and provide access to funds and balances in the event of a critical system failure. Its mission complements Dell Technologies’ commitment to helping organizations and individuals build and protect their digital future, and it’s why we were the first solution provider to join.

Sheltered Harbor developed an industry standard for data protection, portability, recovery and the continuity of critical services. Core to the standard, Sheltered Harbor participants back up critical customer account data each night to an air-gapped, decentralized vault. The data vault is encrypted and completely separated from the institution’s infrastructure, including all backups. Participants always maintain control of their own data.
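
To make the vaulting model concrete, here is a minimal Python sketch of a nightly encrypt-then-vault step. This is an illustration of the concept only, not Sheltered Harbor’s actual specification: the paths, file names and key handling are hypothetical assumptions, and the encryption uses the cryptography package’s Fernet recipe for brevity.

```python
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

# Hypothetical locations: production data, and a vault target that is
# mounted only for the duration of the nightly transfer (the "air gap").
SOURCE = Path("/data/critical/customer_accounts.db")  # hypothetical path
VAULT = Path("/mnt/airgap_vault")                      # hypothetical path

def nightly_vault(source: Path, vault: Path, key: bytes) -> Path:
    """Encrypt a point-in-time copy of critical data and write it to the vault.

    The participant keeps the key, so it retains control of its own data;
    the vault holds only ciphertext, separate from production backups.
    """
    ciphertext = Fernet(key).encrypt(source.read_bytes())
    target = vault / f"{source.name}.enc"
    target.write_bytes(ciphertext)
    return target

key = Fernet.generate_key()  # in practice, managed by the institution's KMS
nightly_vault(SOURCE, VAULT, key)
```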

The standard also includes resiliency planning and certification, but implementation of Sheltered Harbor is not intended to replace traditional disaster recovery or business continuity planning; rather, it is meant to run in parallel with these efforts to deliver a higher level of confidence in the fidelity of the U.S. financial system.

On January 16, 2020, the Office of the Comptroller of the Currency (OCC) and the Federal Deposit Insurance Corporation (FDIC) issued a joint statement on heightened cybersecurity risk, outlining steps financial institutions should take to prepare for and respond to a heightened threat landscape. It also advised them to consider whether their backup and restoration practices meet industry standards and frameworks, such as Sheltered Harbor, to safeguard critical data.


Dell Technologies is not only the first solution provider to join Sheltered Harbor, but also a committed partner to financial players across the globe. How do we help? Ask the eight out of 10 global banks using Dell EMC Data Protection, or the 14 of the top 15 U.S. banks doing the same. They know that the interconnected nature of the U.S. and global financial markets means that any disruption to banking services in one country or region can quickly cascade to others, causing widespread financial panic.

As increasingly sophisticated ransomware and other cyberattacks continue to threaten customers in every industry, financial services are a huge target. The industry’s dependence on key systems to maintain normal operations clearly illustrates the importance of implementing proven and modern strategies and solutions to protect the most critical data.

The Dell EMC PowerProtect Cyber Recovery Solution for Sheltered Harbor helps participants achieve compliance with data vaulting standards and certification, plan for operational resilience and recovery, and protect critical data into the next Data Decade. Consider it your shelter from the potential storm of cyberattacks.

Friday 28 February 2020

Dell EMC Data Science Certification That Will Pay Off



Data Science

Data Science spans several disciplines, including Statistics, Machine Learning, Data Analysis, Computer Science, and Research. If you want to pursue a Data Science career, you are probably wondering how and where to learn about these different fields to advance your career. Don’t worry: this can seem daunting if you are entirely new to the field and trying to figure out how to start a Data Science career.

But not all job roles need all of these skills. Many job roles and companies emphasize some skills over others, so you do not have to learn, and be an expert in, everything.

Data Science Certification

Certifications are the best way to show that you understand a particular field, and Data Science and Analysis career paths map directly to their corresponding certifications. Note that courses and certifications are two separate things: the former is useful when you need to learn something, while the latter shows that you have learned those skills and are ready to enter the industry.

Dell EMC Data Science Associate Certification

Dell EMC provides an associate certification that takes a hands-on, practitioner approach in what it describes as the industry’s most extensive learning and certification program. Once you pass the exam, you are recognized as a Dell EMC Proven Professional. The Data Science certification path covers advanced levels as well.

Dell EMC Proven Professional Certification Program

The Dell EMC Proven Professional Certification Program demonstrates your knowledge of the technology and verifies your skills in making transformations real, reaching business goals, and staying ahead of the competition.

The advantages associated with the program include:
  • The program concentrates on technologies that can be applied to all IT environments.
  • Role-based training provides complete knowledge of Dell EMC’s hardware, software, and solutions.
  • Candidates gain exposure to new and advanced IT technologies that focus on emerging operations, business, and financial models and facilitate secure business communities.

Who Should Obtain the Dell EMC Certification?

  • Cloud Architects
  • Cloud and Storage Administrators
  • Technology Architects
  • Cloud Platform Engineers
  • Implementation Engineers
  • Application Developers

Dell EMC Certification Functions

The Dell EMC certifications are chosen based on an individual’s interest in the associated job function or product/technology family.

The various functions are:
  • Technology/Associate: These certifications cover technologies such as Cloud, Storage, Data Protection, Data Science, and Infrastructure Security. Other associate-level certifications provide expertise in Converged Infrastructure, Servers, and Networking.
  • Design: This function develops your skills in analyzing and designing infrastructure and Cloud Computing solutions. The associated certifications include Cloud, Data Storage, Data Protection, and Converged Infrastructure.
  • Deploy: This function builds your skills in forming and deploying reliable, sound deployment strategies. The associated certifications include Hybrid Cloud, Data Storage, Data Protection, Converged Infrastructure, Software-Defined Infrastructure, Networking, and Servers.
  • Manage: Learn to use and maintain Dell EMC infrastructure solutions to enhance business. The associated certifications include Hybrid Cloud, Data Storage, Data Protection, Converged Infrastructure, Software-Defined Infrastructure, Networking, and Servers.
  • Support: This function builds your skills in installing, managing, and troubleshooting Dell EMC products, technologies, and solutions. The associated technologies include Data Storage, Scale-Out Storage, Networking, and Servers.

Dell EMC Certification Levels

Dell EMC certifications are offered at four skill levels:
  • Dell EMC Certified Associate (DECA): Validates foundational knowledge of a technology area.
  • Dell EMC Certified Specialist (DECS): Verifies role-specific skills on particular products and technologies.
  • Dell EMC Certified Expert (DECE): Provides advanced experience and skills in varied technologies.
  • Dell EMC Certified Master (DECM): Makes you a subject matter expert in complicated scenarios and multiple technologies.

Job Role: Data Scientist

Nowadays, Data Scientist is often used by hiring professionals as a blanket title to describe jobs that are entirely different. Reading the job description carefully will help you identify jobs that you are already qualified for; moreover, you can develop a specific skill set to match the job roles you want to pursue.

As a Data Scientist, you perform data cleaning, analysis, predictive modeling, and visualization and, in some cases, software engineering duties as well. A Data Scientist is a key hire for any consumer-facing company with large amounts of data, or for any company that offers data-based services.

End Notes

Data is here to stay. The world is facing a shortage of skilled professionals, and having the right skill at the right time will give you an advantage in the industry. Always ask for the moon; even if you miss, you will land among the stars.

Happy Learning..!!

Thursday 27 February 2020

Dell Technologies Wins an Emmy: That’s Good News for Financial Trading and Other Data Analytics Applications

The National Academy of Television Arts and Sciences has honored Dell Technologies with a 2020 Technology and Engineering Emmy® for Dell EMC Isilon’s role in driving early development of Hierarchical Storage Management (HSM) systems. This is exciting news for those of you working with artificial intelligence (AI), machine learning (ML) and deep learning (DL) applications of all types. Here’s why.

Back in the early 2000s, media and entertainment companies began to require large-scale shared storage to handle the increasing frame resolution demanded by viewers and to support a proliferation of media formats. Technology has progressed to accommodate the increasing demands of these workloads, and the result is advanced storage like Dell EMC Isilon.


What’s exciting is that the same technology used to accelerate video processing can also serve as the critical foundation for modern data analytics applications. You can’t have Big Data if you don’t have Big Storage. But larger storage simply isn’t enough to deliver performance at scale.

Let’s look at the financial trading industry. Today’s trading firms are transitioning their algorithmic models from intraday to multi-day trading. This, coupled with ongoing exponential growth in daily transactions, means that financial systems can no longer store their active trade data sets (> 10 TB) in memory. Without advanced storage, firms would need to reduce the data sets they are working with, give up near real-time performance, or limit the number of simultaneous processes (i.e., concurrency).
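
To see why storage throughput, rather than memory, becomes the limiting factor, consider a toy out-of-core computation. The sketch below streams a hypothetical tick-data CSV through pandas in fixed-size chunks, so the working set stays bounded no matter how large the file is; the file and column names are assumptions for illustration.

```python
import pandas as pd

TICKS = "trades_2020.csv"  # hypothetical tick-data file, far larger than RAM

totals: dict = {}
counts: dict = {}

# Stream the file in one-million-row chunks instead of loading it all at once.
for chunk in pd.read_csv(TICKS, usecols=["symbol", "price"], chunksize=1_000_000):
    grouped = chunk.groupby("symbol")["price"]
    for symbol, price_sum in grouped.sum().items():
        totals[symbol] = totals.get(symbol, 0.0) + price_sum
    for symbol, n in grouped.count().items():
        counts[symbol] = counts.get(symbol, 0) + n

mean_price = {s: totals[s] / counts[s] for s in totals}  # mean price per symbol
print(mean_price)
```

Every pass over the file is bounded by storage throughput rather than memory capacity, which is exactly where a high-throughput storage system earns its keep.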

This isn’t just a problem for financial trading. Utility companies need to work with increasingly greater data flows as more and more meters are brought online. Massive numbers of smart sensors are being deployed in Industry 4.0 applications, smart buildings, and smart cities. There is more data to collect, and as a result there is more data to process. If this data is going to be useful, it must be processed in as close to real-time as possible. After all, it’s pointless to learn that a stock is a good buy or that a system is about to fail after the window of opportunity to act has passed.

What’s extraordinary about Dell EMC Isilon is how it overcomes the limitations of previous memory architectures. HSM automates movement of data between high-speed (i.e., high-cost) storage and lower-speed storage to balance cost with access speed. HSM requires massively parallel I/O to achieve this and avoid bottlenecking the system.
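
The core HSM idea can be sketched in a few lines of Python: demote data that has not been accessed recently from the fast (high-cost) tier to the slow tier. A real HSM system does this transparently, in parallel and at massive scale, and leaves a stub behind so reads can recall the file; the paths and the 30-day threshold below are illustrative assumptions.

```python
import shutil
import time
from pathlib import Path

FAST_TIER = Path("/mnt/fast")   # hypothetical high-speed (high-cost) tier
SLOW_TIER = Path("/mnt/slow")   # hypothetical lower-speed tier
COLD_AFTER = 30 * 24 * 3600     # demote files untouched for 30 days (assumed)

def demote_cold_files() -> None:
    """Move files not accessed within COLD_AFTER seconds to the slow tier."""
    now = time.time()
    for path in list(FAST_TIER.rglob("*")):
        if path.is_file() and now - path.stat().st_atime > COLD_AFTER:
            dest = SLOW_TIER / path.relative_to(FAST_TIER)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), dest)  # a real HSM would leave a stub here

demote_cold_files()
```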

Isilon delivers up to 945 GB/s of throughput and 17M IOPS, along with industry-leading storage efficiency of 85 percent. Isilon’s outstanding performance extends across the board, as shown by its impressive STAC benchmarks. When combined with Dell EMC PowerEdge servers and NVIDIA GPU technology, the results are near real-time with high concurrency for even the most demanding AI and ML applications.

Two other important factors are scalability and transparency. Isilon scales from 10s of TB to 10s of PB. In addition, HSM is handled transparently. With these capabilities, Isilon is able to serve as a Data Lake for running analytics, enabling developers to focus on what to do with data rather than expend their resources trying to manage it.


The benefits to financial trading firms are significant. They can train, validate, and score models faster, reliably share access to a single copy of data, and deploy more advanced and iterative models. Enterprise security and compliance become easier to manage. And higher model accuracy directly impacts the bottom line. For example, one large New York City hedge fund has been using Isilon storage since 2007 and is now able to leverage 30.5 PB of tick data analytics in its daily operations. It has also seen a significant 390 percent increase in throughput, allowing it to run larger, more sophisticated research jobs without any impact on performance. In addition, as the company grows, it is able to scale storage much faster than compute resources.

Data analytics for applications like AI and ML are steadily becoming more important, and storage is the bridge to get there. With the throughput, scalability, and transparency of innovative storage technology like Isilon, there are no bottlenecks to innovation.

Tuesday 25 February 2020

Dell EMC Streaming Data Platform Enables Insights for Streaming Data from the Edge

Streaming Data Creates Massive Potential for Organizations Across Industries


According to IDC, more than a quarter of data created in the global datasphere will be real-time in nature by 2025. Much of that data will come from the edge, originate from sensors, cameras and drones, and come in the form of a continuous data stream.

Streaming data creates additional complexities in the already intricate world of unstructured data. Because it tends to vary in volume and boundaries, and because its timestamps can arrive out of order, the need for a Data First infrastructure – where organizations know exactly where their data is and how to harness its full potential – becomes more critical than ever. While this type of data has the potential to drive significant innovation within an organization, it also presents those who work closely with enterprise infrastructures with a host of challenges. These can include managing multiple infrastructures, dealing with data inconsistency risks and a time-consuming development process for application developers.
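
To illustrate the out-of-order timestamp problem, here is a simplified, single-process sketch of the watermarking technique streaming engines commonly use: buffer events briefly, then release them in time order once the watermark (the highest event time seen, minus an allowed lateness) passes them. The five-second lateness bound is an assumption for the example.

```python
import heapq

ALLOWED_LATENESS = 5.0  # seconds an event may arrive late (assumed)

class WatermarkBuffer:
    """Reorder slightly-late events; reject anything older than the watermark."""

    def __init__(self) -> None:
        self.heap: list = []           # min-heap keyed by event time
        self.max_seen = float("-inf")  # highest event time observed so far

    def push(self, event_time: float, payload: str) -> list:
        if event_time < self.max_seen - ALLOWED_LATENESS:
            return []  # too late; a real engine would side-output this event
        self.max_seen = max(self.max_seen, event_time)
        heapq.heappush(self.heap, (event_time, payload))
        watermark = self.max_seen - ALLOWED_LATENESS
        ready = []
        while self.heap and self.heap[0][0] <= watermark:
            ready.append(heapq.heappop(self.heap))
        return ready  # events now safe to process, in time order

buf = WatermarkBuffer()
for t, v in [(1.0, "a"), (3.0, "b"), (2.0, "c"), (9.0, "d")]:
    print(buf.push(t, v))  # "c" arrives late but is still emitted in order
```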

Imagine providing your developers and operators with a platform purpose-built for them to ingest, analyze, and manage streaming data. Imagine monitoring anomalies in a manufacturing line, an energy field or railroad tracks through sensors – abnormalities of livestock, airplanes and shipping containers tracked through drone video feed – or the ability to adjust traffic lights depending on volume or detect security threats through video streams. These are just a few of the possibilities when an organization creates a Data First infrastructure focused on a proper streaming data foundation.


Capture the Streaming Opportunity with Dell EMC Streaming Data Platform


To help customers turn this unique set of infrastructure challenges into opportunities for deeper business insights, Dell Technologies is introducing the Dell EMC Streaming Data Platform. This all-new software offering enables ingestion and analytics of real-time streaming edge data in the data center.

With the release of this enterprise-ready platform, organizations now have access to a single solution for all their data (whether streaming or not) that provides auto-scaling ingestion, tiered storage with historical recall on-demand and unified analytics for both real-time and historical business insights.

Re-Designing Complicated Infrastructures with an Out-of-the-Box Solution


As many organizations have begun to address the rise in streaming data by creating additional infrastructure solutions, they have inadvertently created duplicate data silos along the way that each require individual implementation, management, security and analysis. The Streaming Data Platform unifies those infrastructures into one out-of-the-box solution to streamline data and its management for actionable insights.

The software platform provides enterprise-grade security and serviceability and is built on open-source technologies such as Kubernetes for orchestration, Apache Flink for real-time and historical query processing and Pravega for ingestion and publish-subscribe. The Streaming Data Platform empowers innovation far into the future – creating a programming model that reduces application development time, giving your team more time to focus on innovation and the next level of business needs.
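
For intuition about the ingestion-and-subscribe model at the heart of the platform, here is a deliberately tiny, in-process Python illustration using only the standard library. A real deployment would use the Pravega client against a durable stream and run the analytics in Flink; the sensor and event names here are invented for the example.

```python
import queue
import threading

stream: queue.Queue = queue.Queue()  # stands in for a durable stream (toy only)

def sensor_writer(n: int) -> None:
    """Publisher: appends events to the stream, like an edge sensor would."""
    for i in range(n):
        stream.put({"sensor": "line-7", "reading": i})  # hypothetical sensor
    stream.put(None)  # end-of-stream marker for this toy example

def analytics_reader() -> None:
    """Subscriber: consumes events in order, like a streaming job would."""
    while (event := stream.get()) is not None:
        if event["reading"] % 100 == 0:
            print("anomaly check:", event)

writer = threading.Thread(target=sensor_writer, args=(500,))
writer.start()
analytics_reader()
writer.join()
```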


Improving Workflows and Outcomes Enterprise-Wide


As a platform that enables insights to drive innovation, the Streaming Data Platform opens a wealth of opportunities across countless industries – from retail, to agriculture, to automotive and the energy sector. Example uses of this technology include the real-time and historical analysis of video and telemetry streams to monitor livestock health or manage planning of large construction projects. One customer, RWTH Aachen University in Aachen, Germany, runs research projects for nearby small businesses – allowing them to reap the rewards of research that creates more efficiency and deeper insights, and frees up time to enable innovation in use cases like the manufacturing example above. “The reduction of management needed to operate the software system and [Streaming Data Platform] stack is crucial … to ease the work of developers and minimize hard DevOps skill requirements,” said Philipp Niemietz, Head of Digital Technologies Research Group at RWTH.

Addressing the Streaming Data Challenge


When one thinks of enterprise storage, file and object are likely the first categories that come to mind. But with the rise of real-time data, it’s clear that in the years to come, storage solutions for streaming data will become a critical need in the enterprise.

Friday 21 February 2020

6 Catalysts that will Change Media and Entertainment in 2020


The Media and Entertainment Industry is undergoing major transformation through the development of new trends, new processes, new ideas and new technologies. These exciting changes in the industry will drive innovation, disruption, and opportunities for growth in media and entertainment.

The drivers that are impacting this transformation are constantly shifting, including how the content is created, the way we share media, our engagement with media, the delivery of media, our consumption of media, and of course, the way that we pay for media.

In the latest Global Entertainment & Media Outlook (pdf), PricewaterhouseCoopers projects that the industry will generate $2.6 trillion in revenue by 2023. Digital media is projected to represent 60% of total industry revenue, with China as the leading global player.

The key opportunities, trends and challenges that I see shaping the industry’s transformation include:

Volumetric Videos. The next key development in media production is expected to be volumetric video, particularly with respect to the evolving Virtual and Augmented Reality markets. It is also likely to become a mainstream tool for certain Animation, Gaming and VFX work, as point cloud and color data can be used to create virtual sets and models with increasing ease.

5G Wireless Technology. The rollout of fifth-generation (5G) wireless technology across the globe will present a host of new opportunities across the media and entertainment sector. The technology will help further accelerate the already significant consumption of media on mobile devices. It will also support real-time access to higher quality content, along with opportunities for production techniques that increasingly rely on cloud services.

Shift in Advertising Dynamics. Over recent years, there has been significant movement in advertising and marketing dynamics between media and entertainment segments. There has been an explosion in the growth of digital content delivered via paid streaming services such as Netflix and Spotify. People are enjoying the breadth of content on their own schedule, along with the option to consume it on a range of platforms. This growth in subscription digital content is having a negative impact on traditional television and radio advertising spending as advertisers adopt other strategies to reach that audience.

Storage of Data. The industry is re-thinking data storage, driven primarily by the massive volumes of digital content that need to be stored, shared and manipulated in real-time. Disruptive technology is impacting traditional storage solutions, with rapid growth in volumes of data arising from automation, virtual production and machine learning. Resolution, media delivery platforms and consumer demand for unprecedented access to content are all part of this exponential growth in storage and performance needs.

With this trend projected to continue, data storage will become an issue if not managed properly. The key question that needs to be addressed is how and where all this content will be stored. Storage platforms must deal with this exponential growth in unstructured data, provide real-time insights into data usage, and handle mixed file sizes, including the proliferation of lots of small files.

Security and Trust. Trust in the media – particularly news media – is at an all-time low, and repeated breaches of security and privacy across social media networks and advertising giants have made customers wary of how much personal information they are willing to share. In the Animation and VFX space, there have been some very significant and well-reported network breaches, along with content and personal information theft. The industry is uniquely vulnerable to cyberattacks, due mainly to the value of the product. Threats such as privacy and data breaches can cause irreparable damage to a company and the industry. Fighting these risks is a key motivator for the media and entertainment industry to apply state-of-the-art cyber and hardware security.

Partnering in the Industry. Partnerships in the industry are vital for success. The partnership arrangement can be as simple as an agreement between two companies for production and distribution of content or as complex as a merger and acquisition. Recently media production companies have begun partnering with telecommunication companies for the distribution of their content. Both parties are seeking to monetize this content, perhaps bundling with their existing services and generating valuable Data Capital in the process. These partnerships and acquisitions have been occurring in the Animation, VFX, TV, Cable and Advertising spheres, where often, size offers more resources, reach and ultimately opportunities.

With the rapid rate of change, it has never been more important to have strong relationships, built on trust and a genuine understanding of the challenges facing M&E. After 19 years in the media industry, including 15 years at Animal Logic, I joined Dell Technologies because I believe that it is the only end-to-end technology company that specifically understands the pressures and technology needs of the Media & Entertainment industry.

Thursday 20 February 2020

Think You Need Different Systems for AI, Data Analytics and HPC? Think Again

With the new Dell EMC HPC Ready Architecture for Artificial Intelligence and Data Analytics, you can now run AI, data analytics and HPC workloads on a single system.


In today’s marketplaces, competitive advantage increasingly goes to the data‑driven enterprise. This is particularly true for those enterprises that are ready to seize the day — and capitalize on the convergence of artificial intelligence, data analytics and high performance computing.

The time is ripe for this convergence, thanks to some amazing advances in HPC systems and the applications that can run on them. As HPC clusters become smaller, simpler, and less costly to deploy and operate, enterprise IT teams can use HPC to provide the throughput and capacity they need to accelerate AI, data analytics and other compute-hungry enterprise workloads, like modeling and simulation.

If your organization is on this path, you’re headed in a great direction. When you converge AI, data analytics and HPC on a single system, you gain the horsepower you need to run high‑performance data analytics, simulation, boost high‑frequency trading, help with drug trials, enhance risk analysis, improve fraud detection, collect and analyze data from the Internet of Things, and accelerate motion picture animation and special effects cycles — just to name a few of the opportunities that come with convergence.

And this brings us to the news of the day — the launch of the new Dell EMC HPC Ready Architecture for AI and Data Analytics.

One system for AI, data analytics and HPC


If you’re in an IT shop, you know that system configuration can be a complex task, requiring a delicate balance among workload requirements, performance targets, data center constraints and your IT budget. With the new HPC Ready Architecture for AI and Data Analytics, you’re home free when it comes to these challenges. Engineers from Dell Technologies have done the heavy lifting for you, so you can confidently and quickly deploy one system for AI, data analytics and HPC workloads.

Like all the other Dell EMC HPC Ready Architectures, this new offering is designed by expert engineering teams to simplify system configuration, deployment and management. The HPC Ready Architecture for AI and Data Analytics has been optimized, tested and tuned for a variety of applications on the Kubernetes stack, with ongoing testing and validation to expand the list of validated options.

Flexible workload management enables dynamic movement of jobs between Slurm and Kubernetes based on user demand, with a scalable shared filesystem to support both. Bright Cluster Manager provides a single‑pane‑of‑glass management experience for Dell EMC hardware, Slurm and Kubernetes. The solution includes a set of Ansible and Terraform playbooks for those who like to build via open source tools.
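
As a rough illustration of what routing jobs between Slurm and Kubernetes can look like, the sketch below shells out to each scheduler’s standard CLI based on the kind of job. The routing rule, batch script and manifest names are assumptions; in the Ready Architecture, Bright Cluster Manager and the provided playbooks handle this far more completely.

```python
import subprocess

def submit(job_kind: str, spec_path: str) -> None:
    """Route a job to Slurm (batch HPC) or Kubernetes (containerized workload).

    spec_path is a Slurm batch script or a Kubernetes manifest, respectively;
    both files are assumed to exist and be valid.
    """
    if job_kind == "hpc":
        subprocess.run(["sbatch", spec_path], check=True)                  # Slurm
    elif job_kind == "analytics":
        subprocess.run(["kubectl", "apply", "-f", spec_path], check=True)  # K8s
    else:
        raise ValueError(f"unknown job kind: {job_kind}")

submit("hpc", "simulation.sbatch")     # hypothetical Slurm batch script
submit("analytics", "spark-job.yaml")  # hypothetical Kubernetes manifest
```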

This is a package, not a single point-product. The HPC Ready Architecture for AI and Data Analytics comes with a best-practices guide and toolkit to help you go from a factory‑installed operating system to a full Kubernetes cluster with a repository on GitHub. And with Bright Cluster Manager software, system admins can quickly get clusters up and running and keep them running reliably throughout their lifecycles.


Great ROI


There is even a return-on-investment (ROI) side of this story. A detailed total cost of ownership (TCO) analysis by Dell Technologies showed that the ability to run multiple workloads on a single infrastructure lowers the cost for hardware, software and maintenance, while providing additional savings from reduced power and cooling costs. This comparative analysis demonstrated that with a converged solution, system users can get faster access to more comprehensive datasets, while IT teams can reduce silos and consolidate operations for up to 3x lower TCO. With this new HPC Ready Architecture, you can now run AI, data analytics and traditional HPC workloads on the same system — simplifying deployment and management while keeping your costs low.


Tuesday 18 February 2020

It’s Time for a Multi-Cloud Approach that Works for Health IT


Supporting multiple cloud providers is now a requirement in healthcare. Health IT organizations are being asked to manage cloud-based solutions ranging from SaaS-based applications to colocated equipment at a service provider. A recent survey found 35 percent of healthcare organizations have more than half of their data or infrastructure in the cloud. In fact, there are several reasons to consider off-site cloud providers as the destination for some of your healthcare IT needs.

The typical drivers for using cloud-based products include cost, performance and security. Other considerations are access to capital, staffing and regulatory needs. In some cases, healthcare organizations are not looking to make any further IT investments in data centers and have developed a “cloud first” mentality, but this may lead to an oversimplified view of a more complex challenge.

Just as every organization must have its own specific business model, each healthcare provider’s data strategy must be driven by the needs of its specific clinical and business workloads – not the other way around. Cloud strategies are ever-evolving rather than a simple one-off solution. For example, on-premises solutions are better for risk reduction and monitoring, while off-premises solutions may offer better manageability, ease of procurement and cost. For this reason, every healthcare provider must determine its priorities to optimize a cloud strategy that matches the best location for their data and applications.

A multi-cloud infrastructure offers the ability to identify and monitor information across the entire healthcare data ecosystem through a single pane of glass, simplifying intelligence at the point of care and collaboration among clinicians. This strategy provides a consistent operating model and simplified management across private clouds, public clouds and edge locations, along with the flexibility to adapt to future changes in health IT.

Common Control Plane for Multi-Cloud


Dell Technologies cloud-based solutions have been designed to deliver flexibility and choice – to help cut through the chaos. Our solutions offer access not just to the hyperscalers (e.g., Amazon AWS, Microsoft Azure, Google GCP), but also to hundreds of VMware Cloud Providers (VCPP). Your private cloud runs on our best-in-breed Dell EMC infrastructure, and VMware Cloud Foundation further controls the environment so you can seamlessly move workloads among public cloud providers.

To fully leverage a multi-cloud environment, your organization needs to contain the cloud sprawl to effectively manage the entire application portfolio. We have seen many organizations move to the public cloud without a long-term plan in place, ultimately forcing them to bring workloads back in-house to control cost, performance and security. We recommend a pragmatic approach, assessing applications and workloads for the best landing zone. With a common control plane management interface, your organization can visualize, evaluate costs and control risks associated with the computing environment between multiple cloud providers.

Steps to Building your Multi-Cloud Environment


1. Modernize your in-house infrastructure – this includes choosing best-in-class hardware and software with the most resilient and efficient architecture to build upon. This can be a traditional three-tier architecture (separate server, network and storage platforms), engineered converged infrastructure (CI) or hyper-converged infrastructure (HCI) with an appliance-like design. As you move from traditional three-tier to converged to hyper-converged designs, management is simplified and total operating costs are lowered. HCI offers single-button upgrading and patching as opposed to individually patching all the disparate components. Typically, 20%–50% of patches for software bugs can introduce new, unknown problems. The goal of modernizing IT infrastructure is not only to lower costs but to provide the most resilient and performant platform to run critical applications.

2. Virtualize the environment – Most health IT operations have virtualized applications or workloads today — in fact, 93% of hospitals are already using VMware products. To set the foundation for cloud-like operations, increasing automation and instrumentation of the environment is necessary. Applications should be surveyed and qualified for their ability to be cloud-enabled; assessing the cloud readiness of workloads will provide the basis for determining the optimum path. IT should aim to create a self-service portal where internal customers can perform basic tasks and IT administrators can see how workloads perform. The more automation is placed into an IT environment, the more streamlined IT operations become.

3. Transform your operating model with automation – Workload placement should be optimized on the criteria of cost, performance and security, incorporating application discovery. You should discuss whether a workload is best kept in-house, moved to a private cloud, or moved to a public cloud. Using VMware Cloud Foundation (VCF) can help you place the correct workload in the best destination based on your defined criteria.

Sunday 16 February 2020

Gaining Speed – The Momentum around Data Protection for Kubernetes

Physicists define momentum as “the impetus gained by a moving object.” That may be the technical view, but in the world outside of physics, momentum commonly means “gaining speed.”

Many things thrive on momentum. An underdog sports team can gain momentum and work their way to a championship. Politicians can ride a wave of momentum into office. Even personal relationships build upon their momentum to blossom into love. In business, momentum is seen as a good thing. It’s a sign that you are doing something right, getting the attention of buyers and solving problems for customers.

For any new technology, there are few attributes more valuable than momentum, and it’s no secret that the adoption of containers is accelerating. Kubernetes (K8s) is gaining momentum as the container platform of choice. Cloud monitoring company Datadog recently surveyed container users and found that over 45 percent were using Kubernetes – representing a 20 percent increase over 2018.

So, what’s all the fuss about? Containers answer the customer problem of how to run applications consistently across multiple computing environments. Kubernetes, specifically, is an open source platform that provides orchestration of containerized applications. One of the benefits of Kubernetes is that it is ideal for hosting cloud native applications with requirements for flexible demand and/or application rollback.
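
For a feel of what that orchestration looks like programmatically, here is a small sketch using the official Kubernetes Python client to scale a deployment up and down: the flexible-demand loop in miniature. The deployment name and namespace are hypothetical.

```python
from kubernetes import client, config  # pip install kubernetes

config.load_kube_config()  # reads the same kubeconfig that kubectl uses
apps = client.AppsV1Api()

NAME, NAMESPACE = "dvr-api", "media"  # hypothetical deployment and namespace

def scale(replicas: int) -> None:
    """Ask Kubernetes to converge the deployment to the given replica count."""
    apps.patch_namespaced_deployment_scale(
        NAME, NAMESPACE, body={"spec": {"replicas": replicas}}
    )

scale(10)  # flexible demand: scale out for peak hours
scale(3)   # ...and back in overnight
```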

Enterprise adopters of Kubernetes such as Goldman Sachs, Nordstrom and Comcast (to name just a few) rely on Kubernetes to help transform internal applications as well as make their products more dynamic, scalable and portable. Comcast, for instance, leverages K8s for their cloud DVR service – recording and managing content from millions of customers in multiple geographies – enabling them to become a more agile organization by rescaling, managing data center capacity and deploying content faster.

Whether it’s a cloud DVR or even ordering your favorite coffee via your phone app, IT is focusing on creating new business processes, culture, and customer experiences to meet changing business and market requirements. There’s plenty of orchestration and coordination needed to deliver these changes; all made possible through microservices and cloud native architectures. Kubernetes and containers are fast emerging as a key platform to enable these changes and becoming a key to modernizing development fabrics and IT transformation.

With momentum come challenges


One thing is for sure, Kubernetes is a part of many companies’ transformative plans to modernize. Whether internal or customer facing applications, K8s is a part of their development fabric. The recent “Conference Transparency Report” from KubeCon 2019 by The New Stack showed that of 380 respondents, over 55 percent are deploying database workloads in Kubernetes containers – a 23 percent increase over 2018. This kind of rollout does not come without complexities or challenges.

One such challenge is the backup and restore of Kubernetes. With all this development happening in the background:

◉ How are organizations protecting their Kubernetes workloads?

◉ How are IT teams managing the protection and restore of Kubernetes environments?

Kubernetes momentum and rapid adoption may have left some IT organizations behind as they focus on delivery.


To maintain momentum and continue to innovate, IT teams using K8s are challenged to continually release new features, and implementing a backup and recovery strategy is part of this release process. Dell EMC PowerProtect Data Manager integrates with Project Velero, a VMware-originated open source tool that can back up, restore, perform disaster recovery on and migrate Kubernetes cluster resources, persistent volumes and storage attributes. Dev/Test teams can work from existing data sets.
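
As a concrete sketch of what a Velero protect-and-restore cycle looks like, the snippet below wraps the standard velero CLI from Python. The namespace and backup names are hypothetical, and PowerProtect Data Manager layers centralized policy management on top of this kind of operation.

```python
import subprocess

def velero(*args: str) -> None:
    """Invoke the velero CLI (assumed installed and pointed at the cluster)."""
    subprocess.run(["velero", *args], check=True)

# Back up one namespace, including snapshots of its persistent volumes.
velero("backup", "create", "payments-nightly",  # hypothetical backup name
       "--include-namespaces", "payments",      # hypothetical namespace
       "--snapshot-volumes")

# Later, restore that namespace from the backup, e.g. into a DR cluster.
velero("restore", "create", "payments-restore",
       "--from-backup", "payments-nightly")
```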


PowerProtect Data Manager enables customers to place either existing production workloads or new workloads into Kubernetes production environments, knowing they will be protected. IT operations and backup admins can then manage Kubernetes data protection from a single management console, as K8s admins define protection for their workloads from the Kubernetes APIs.

PowerProtect Data Manager protects production and Dev/Test workloads in Kubernetes environments, ensuring that the data is easy to back up and restore, always available, consistent and durable in a Kubernetes workload or DR situation. App owners gain the benefit of an intuitive, easy to use UI while IT Ops can take advantage of centralized governance separate from Dev/Ops.

Organizations can protect Kubernetes directly to PowerProtect DD or Data Domain and gain the benefits of secondary storage with unmatched efficiency, deduplication, performance and scalability. PowerProtect Data Manager is creating momentum of its own as an answer to the data protection challenge for Kubernetes namespaces, leading the way in enabling data owners, K8s admins and IT Ops alike to discover, define and manage protection of Kubernetes as part of their complete data protection strategy.

As organizations jump onto the wave of momentum with Kubernetes, it is critical that they consider a data protection solution. Regardless of whether it’s IT operations or development teams, the work that is done in Kubernetes requires protection. PowerProtect Data Manager protection is an excellent option to discover, protect and restore K8s containers in production and Dev/Test environments.

Saturday 15 February 2020

Dell Technologies and Splunk: Enabling AI and Analytics Together

I’m a Business Development lead with the Dell Technologies Unstructured Data Solutions (UDS) team. In this role I drive go-to-market and enablement for enterprise data center storage offerings with many of our partners in the Data Analytics and AI Solutions space. One of our key partners in this area is Splunk and I’ve been working with them for quite a while now. In fact, I recently attended .conf19 in Las Vegas with over 11,000 Splunkers and it opened my eyes. Let me tell you why I’m so excited about the possibilities for customers working together with Dell Technologies and Splunk on their journey to AI.

Splunk is already one of the largest data aggregators in the industry. I personally evaluated and deployed Splunk back in 2007 at a large media and entertainment company where we were using approximately 50,000 render cores, with data coming in from numerous machine logs, applications, databases, schedulers, render farms and many other sources. With this level of data consolidation there’s a need to apply machine learning for predictive learning and root cause analysis, a significant use case for Splunk.

Data growth continues unabated at an exponential rate due to the number of connected devices talking to the network. Everything today has sensors, all of them transmitting data to some central repository for analysis alongside other related data sets. With these and other IoT devices being so prevalent and decentralized, we need globally distributed storage repositories to receive this data.

Last year, Splunk announced availability of a new, modern data path model called SmartStore. SmartStore allows you to decouple compute from storage and supports the S3 protocol. You still have the hot/cache tier as tier 0, but now you can tier from it directly to a SmartStore target like Dell EMC Elastic Cloud Storage (ECS), which has been certified by Splunk. In addition, Dell EMC has released an ECS Splunk plugin, freely available for customers who have deployed ECS and want to monitor it via Splunk. There are links below for information and downloads.
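
For context, pointing SmartStore at an S3-compatible target such as ECS comes down to an indexes.conf stanza. The short Python sketch below just writes out a minimal example of that stanza; the endpoint, bucket and index names are placeholders, so treat it as a starting point rather than a complete, validated configuration.

```python
# Minimal SmartStore stanza for an S3-compatible remote store such as ECS.
# The endpoint, bucket and index names below are hypothetical placeholders.
SMARTSTORE_CONF = """\
[volume:ecs_store]
storageType = remote
path = s3://splunk-smartstore-bucket
remote.s3.endpoint = https://ecs.example.com:9021

[main]
remotePath = volume:ecs_store/$_index_name
"""

with open("indexes.conf", "w") as f:
    f.write(SMARTSTORE_CONF)
print(SMARTSTORE_CONF)
```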


In the lead-up to Splunk .conf19, our friends from Big Data Beard set out on an epic road trip from Boston to Las Vegas, driving an RV loaded with IoT sensors, collecting data and storing it on ECS via SmartStore. With the power of the combined offerings of Dell Technologies, Splunk and other sponsors, the Big Data Beard team captured data, spotted trends, predicted failures, and discovered more exciting ways to use Splunk to extract value from machine-generated data. It was a great real-time, real-world use case that generated a lot of excitement at the event.

Dell Technologies had three sessions at Splunk .conf19 that are archived at the links below.

“Splunk Apps for Infrastructure from Dell EMC”. In this session, you’ll hear from an experienced Site Reliability Engineer how Dell EMC’s investment in Splunk apps across their platforms makes it easier for Splunk users to monitor infrastructure and integrate these insights into an overall application performance monitoring strategy.

“Cloud-scale On-Prem: SmartStore Best Practices with Dell EMC”. In this session learn how Dell EMC delivers cloud-scale object storage for on-premises deployments of SmartStore and how you can leverage best practices learned from some of the world’s largest global Splunk deployments.

“Secrets from a Splunk Ninja: Deployment Architecture Best Practices”. In this session learn more about the best practices that power some of the largest, on-premises deployments of Splunk around the world. Determine if SmartStore is right for your environment and size it for maximum performance and benefit.

Dell Technologies and Splunk offer a few ways to help customers tackle AI, ML, or DL challenges with prebuilt, pretested, preconfigured Ready Solutions or Reference Architectures, all with or without GPUs.


As data growth continues, the opportunities for Dell Technologies and Splunk customers expand.


Thursday 13 February 2020

Protect the Power of the Cloud: Data Protection for VMware Cloud

As predictions roll in for trends in technology this year, it is no surprise that data protection, security and automation are rising in importance for organizations worldwide. As more organizations move applications and data to the cloud, solid and reliable recovery is becoming a differentiator. For our customers, Dell EMC is a trusted data protection partner, providing comprehensive coverage, rapid recovery and powerful architecture, resulting in reduced infrastructure costs and simply powerful data protection for VMware environments.

Dell EMC and VMware – Modernizing One Infrastructure at a Time


Dell EMC and VMware co-engineered VMware Cloud on Dell EMC, cloud infrastructure delivered as a service on-premises and fully managed by VMware. VMware Cloud on Dell EMC is built on Dell EMC VxRail infrastructure and runs the VMware SDDC stack to handle compute, storage and network processing. This all-inclusive service also offers a hybrid control plane to provision and monitor resources, as well as a monthly subscription-based pricing model.

VMware Cloud on Dell EMC offers customers support for multiple use cases, such as data center modernization, a switch from CapEx to OpEx financial models, and hardware refresh. Another use case is data latency and sovereignty: because the infrastructure stays on-site, it can be employed by companies with low data latency requirements, data sovereignty requirements, and strict data governance and security needs. Application modernization – including development agility, Kubernetes and traditional application deployments – is another area where customers can utilize VMware Cloud on Dell EMC.


With DRaaS, ensure your data is safe and accessible if your infrastructure becomes unavailable.

Data Protection for VMware Cloud on AWS


Dell EMC also offers Data Protection for VMware Cloud on AWS users. Dell EMC Data Protection Software is cloud-enabled and offers comprehensive data protection and superior performance for backup and recovery across your entire VMware Cloud on AWS environment. It provides the same world-class data protection whether your applications run on-premises or in the public cloud.

Just as VMware Cloud on AWS enables vAdmins to manage cloud resources with familiar VMware tools, Dell EMC seamlessly integrates cloud and on-premises data protection. This allows users to utilize the same data protection tools in the cloud that they already use on-premises, without needing to upskill.

Dell EMC data protection for VMware Cloud allows for extremely simple disaster recovery for VMware Cloud users. Organizations can fail over on demand to VMware Cloud on AWS in the case of a disaster event and spin up and run VMs in their own VMware Cloud on AWS environment. When the disaster event is resolved, they simply use vMotion to move the VMs back on-premises for recovery.


Top Solution Benefits of Running Dell EMC Data Protection for VMware Cloud



◉ Proven enterprise data protection for the enterprise public cloud
◉ Seamless integration with on-premises data protection
◉ Industry’s best-in-class deduplication, which can lead to lower consumption costs
◉ Protection of VMware workloads on AWS storage for increased resiliency
◉ Native integration into VMware management tools for the ultimate automation experience

Dell EMC data protection supports four use cases today with our cloud data protection strategy: long-term retention, disaster recovery, backup to the cloud and hybrid/in-cloud backup.

Automate Everywhere for Simplified Management


PowerProtect Data Manager can automate protection policies, providing companies with an easy and seamless way to ensure their data is always protected. Specifically, infrastructure automation supports attribute-based inclusion and exclusion, where IT admins can set rules to be used for VM discovery and inclusion, with flexible and powerful regex matching. The solution can be auto-deployed by the SDDC deployer (automatically deploying the backup target and joining vSphere), can auto-deploy proxy data movers, and can be deployed from the AWS and other marketplaces.
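
To illustrate the attribute-based, regex-driven inclusion idea, here is a toy sketch that assigns discovered VMs to protection policies by name pattern. The patterns, VM names and policy names are all hypothetical; in the product, these rules are defined and evaluated inside PowerProtect Data Manager itself.

```python
import re

# Hypothetical rules: the first matching pattern assigns the protection policy.
RULES = [
    (re.compile(r"^prod-sql-\d+$"), "gold-15min-rpo"),
    (re.compile(r"^prod-.*"),       "silver-daily"),
    (re.compile(r".*-dev$"),        "bronze-weekly"),
]

def assign_policy(vm_name: str) -> str:
    """Return the protection policy for a discovered VM, or exclude it."""
    for pattern, policy in RULES:
        if pattern.match(vm_name):
            return policy
    return "excluded"

for vm in ["prod-sql-01", "prod-web-02", "analytics-dev", "scratch-vm"]:
    print(vm, "->", assign_policy(vm))
```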

Data protection policies allow data protection administrators to control backup schedules and retention periods. Policies map to the underlying data protection provider from which the policy originated and adhere to the service level agreement (SLA) that the data protection provider supports.

Additionally, Dell EMC offers orchestration automation to move data directly from the VMs to the intended storage target, both on-premises and in the cloud. The Dell EMC data protection solution is software-based, comprising workflows that enhance the existing catalog with provisioning and job status checking tasks. Using vRealize Orchestrator, Dell EMC data protection solutions are easily integrated into the VM environment, ensuring simplicity and ease-of-use in a familiar UI environment.


Success Stories


Watermark – Watermark Solutions (Apptrix), an ERP firm for manufacturing companies, offers multiple levels of disaster recovery to the companies it hosts. Impeded by an eight-hour backup window and frequent failed backups, they switched to Dell EMC Data Protection. Their new IDPA DP4400 appliance was up and running in two days and reliably completes backups in under two hours. The DP4400 works seamlessly with VMware across more than 1,000 VMs, automatically adding new VMs to the backup.

Xavier University – Storing and protecting data for extended periods of time is a necessity for Xavier University. With a 95% virtual environment, Xavier turned to Dell EMC Data Protection to protect vital student and administrative records. Backups of 20 TB that took 2-3 days with their former solution now complete in a single evening. Best of all, Xavier finds they’re saving space as well as time, thanks to the superior deduplication capabilities of Dell EMC Data Protection Software.

Westgate Resorts – Westgate Resorts, a hospitality timeshare company with 27 properties and over 10,000 employees, is growing – but their data protection solution wasn’t cost-effective as they expanded. By switching to Dell EMC Data Protection, they’re able to grow while easily protecting, managing and monitoring their data. By using Data Domain Cloud Tier to Microsoft Azure, they’ve been able to utilize cloud services for a fraction of the cost of keeping old backups on their primary storage.

Dell EMC data protection provides customers with powerful, automated data protection for VMware and multi-cloud environments with deep integration points. Dell EMC and VMware are committed to delivering jointly engineered products, making it easier for our customers to protect and recover their IT investment now and into the future. Our solutions are simple to deploy and manage – on-premises and in the cloud.

Tuesday 11 February 2020

Designing Innovative Disaster Recovery for Leading Architecture School

One of the most compelling use cases for cloud data protection is disaster recovery (DR). DR operations are critical for every business, from startup to enterprise and from research to banking to retail. An inadequate disaster recovery plan or infrastructure can lead to, well, disaster…at the most critical moment.

While there are many effective DR architectures stemming from on-prem and co-located environments, the cloud offers a new dynamic in storage targets – one that can be safe, secure, economically viable and easily managed, with tools that an organization’s IT staff already knows.

Dell EMC data protection solutions include Cloud Disaster Recovery (Cloud DR), cloud-enabled disaster recovery infrastructure that bolsters your business continuity operations and makes the most of your cloud investment, whether it be a hybrid cloud, a public cloud platform (Amazon Web Services (AWS) or Microsoft Azure), or a multi-cloud environment. But the proof is in the pudding, as they say. Let’s look at an organization in Los Angeles that recently rolled out Dell EMC Cloud DR.

Southern California Institute of Architecture (SCI-Arc) is a top-ranked, award-winning institution that is shaping the great architectural minds of tomorrow and serves as an incubator for some of the most innovative thinking in architecture. SCI-Arc offers both graduate and undergraduate degrees and is home to over 500 students as well as 80 faculty members, most of whom are practicing architects.

SCI-Arc stores close to 600 TB of data, and one of the main challenges they face is the continuous growth of that data, which is increasing 20 percent annually. Their prior backup solution no longer met their needs because backups were missing their service-level window – and with their data growth, this was only going to get worse. SCI-Arc’s primary export is the students’ revolutionary work, including animations and renderings that represent days of creativity and imagination. This data must be kept safe and recoverable in any situation, because SCI-Arc needs a record of the files for accreditation and the students need their files for presentations, portfolios, and their careers. SCI-Arc chose to protect their data using Dell EMC Cloud DR hosted on AWS, ensuring that their data is kept safe and is recoverable at a moment’s notice.

Dell EMC Cloud DR provides SCI-Arc the ability to protect on-premises workloads by securely and efficiently copying image backups onto AWS S3 for disaster recovery. In fact, the solution is so efficient that it backs up 90 percent of SCI-Arc’s VMs in one hour and has slashed backup windows from 36 hours down to three. In the event of a disaster, a cloud DR strategy built on Dell EMC Cloud DR offers rapid recovery with three-click failover and two-click failback, and the workloads can be run directly in AWS or recovered to VMware Cloud on AWS.

By deploying DR in this innovative manner, Dell EMC Cloud DR enables organizations to take advantage of the agility and cost effectiveness of cloud object storage. Resources in the cloud are spun up only when the primary data center isn’t available and decommissioned when they are no longer needed – a truly elastic environment. This is much more cost-effective than having hardware constantly up and running within the public cloud.


Maximum for the Minimum

Dell EMC Cloud DR requires a minimal footprint in AWS, as well as minimal compute cycles, resulting in a disaster recovery solution with minimal cost. Dell EMC was also able to help SCI-Arc maintain a smaller cloud footprint. SCI-Arc started with 570 TB of uncompressed data; after running it through Dell EMC’s patented deduplication technology on-prem and in the cloud, they are backing up about 10 TB of data – a 57:1 deduplication ratio. They also went from retaining six months’ worth of backups to retaining two years’ worth of cloud backups with ease, all while having little to zero network impact. Cloud DR has enabled SCI-Arc to become more cost-effective by eliminating the need for a secondary backup server.

Dell EMC was able to revolutionize backup in the cloud for SCI-Arc by eliminating a secondary DR site and extending their cloud backup retention while reducing their overall cloud footprint. According to Peter Kekich, Network Systems Administrator at SCI-Arc, “Dell Technologies was our chosen provider because of their track record. Being able to recall all of these backups at a moment’s notice was crucial for us to be successful running our IT department at SCI-Arc.” In a creative environment such as SCI-Arc, Dell EMC data protection offers peace of mind and eliminates worries about whether student creativity and projects are getting backed up properly or whether the data can be fully and promptly restored.

Learn more about SCI-Arc in this case study.


Sunday 9 February 2020

Tailored Technology for Customization at the Edge

A perfectly tailored suit is an investment. It’s worth it to pay for the perfect fit, high-quality material appropriate for the occasion, and a color that makes your eyes pop.


So why, when it comes to mission-critical technology solutions, are government agencies expected to buy off-the-rack?

As federal agencies expand nascent AI capabilities, deploy IoT technologies, and collect ever more data, their missions require a customized, nuanced approach to transforming edge capabilities.

To combat the data deluge resulting from AI and IoT advances, the Federal Data Strategy’s first-year action plan was released in late December. It urges the launch of a federal CDO Council, establishment of a Federal Data Policy Committee, and identification of priority data assets for open data – all by the end of January 2020. These are just the first steps to prepare for what’s already underway: government’s mass migration to the edge and the resulting proliferation of data. In just five years, Gartner projects, 75 percent of all enterprise-generated data will be processed outside of a traditional data center or cloud.

As we work to manage, analyze, and secure data collected at the edge, we need to evaluate solutions against the same standards we apply in the data center or cloud. To enable insights at the edge, federal teams need the same (or better) capabilities – compute, speed, power, storage, and security – but now in a durable, portable form. This may require equipment that tolerates a higher level of vibration, withstands extreme thermal ranges, fits precise dimensions, or incorporates specialized security requirements.

Partnering with Dell Technologies OEM | Embedded & Edge Solutions enables Federal SIs and agencies to integrate trusted Tier 1 infrastructure into solutions built for their specific mission requirements, or for those of their end users. For instance, working with our team, you might re-brand Dell Technologies hardware as part of your solution, leveraging specialized OEM-ready designs like our XR2 Rugged Server and Extended Life (XL) option. We also offer turnkey solutions designed by our customers and delivered through Dell Technologies, which allows us to further serve what we know are your very specific use cases.

As an example, our customer Tracewell Systems worked with Dell Technologies OEM | Embedded & Edge Solutions to customize the Dell EMC PowerEdge FX architecture, creating a family of products that meets their federal customer’s field dimension requirements for server sleds. Because Tracewell’s T-FX2 solution remains interoperable with standard Dell EMC server sleds, the end customer can now plug and play powerful Dell EMC compute and storage products from the field to the data center, cutting processing time from 14 days to two.

Feds at the edge need the right solution, and need that solution delivered quickly and securely. Agencies and federal systems integrators need a trusted partner that can help them compress time-to-market while ensuring regulatory compliance and providing a secure supply chain. While conducting a search for an OEM partner, agencies and systems integrators should consider vendors that will embrace challenges and engage in a deep, collaborative relationship. Moreover, dig beyond the design of the technology and ask:

◉ Does the vendor have the buying power to guarantee production consistency, so the product can continue to be delivered as designed? If necessary, consider looking for a partner that will guarantee a long-life solution.

◉ Are there lifecycle support services from problem identification, to customized design, to build and integration, to delivery, to experience?

◉ Can the potential partner supply program management to handle all regulation and compliance complications?

◉ Does the vendor have a broad portfolio for easy integration of solutions from edge to core to cloud?

◉ Does the vendor have a deep focus on security – from the chip level through to delivery and support?

These critical aspects will help you design those faster, smaller, smarter solutions, and get them in the field more quickly.

With 900+ dedicated team members, the Dell Technologies OEM | Embedded & Edge Solutions group has embraced challenges for 20 years, creating more than 10,000 unique project designs.

Saturday 8 February 2020

Accelerating Storage Innovation in the Next Data Decade

Over the past decade, technology transformed nearly every business into an IT-driven business. From farming to pharmaceuticals, information technology developments have led organizations to reimagine how they operate, compete, and serve customers. Data is at the heart of these changes and will continue its transformative trajectory as organizations navigate the waves of technological progress in the next “Data Decade.”


In data storage – which touches every IT-driven business – the pace of innovation is accelerating, yet most enterprises continue to struggle with data’s explosive growth and velocity. Getting the highest use and value from their data is becoming ever more critical for organizations, especially for those with data stores reaching exabyte scale.

To deliver strategic value in the enterprise, storage innovation must cross the capabilities chasm from simply storing and moving bits to holistic data management.

In 2019, our Dell Technologies Storage CTO Council studied more than 90 key technologies and ranked which ones have the innovation potential to help storage cross that capabilities chasm in the next 5-10 years. This year, there are three key areas we believe will be difference-makers for organizations that are pushing the limits of current storage and IT approaches.

Let’s take a closer look.

Trend #1: Machine learning and CPU performance unlock new storage and data management approaches


This year, we will see new approaches that solve streaming-data challenges, including the use of container-based architectures and software-defined storage. Customers in industries such as manufacturing, cybersecurity, autonomous vehicles, public safety, and healthcare want to build applications that treat data as streams instead of breaking it into separate files or objects.

Ingesting and processing stream data presents unique challenges that strain traditional IT and storage systems. Because streaming workloads often change throughout the day, storage capacity and compute power must be elastic enough to accommodate them. This requires intelligence within the storage that can autoscale instantly.
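As a rough illustration of that autoscaling decision, the sketch below sizes a stream's parallelism to the observed ingest rate. The per-segment throughput figure and the function itself are illustrative assumptions, not any product's actual policy.

```python
# Hypothetical autoscaling rule: scale stream segments with ingest rate.
# The 10,000 events/sec per-segment rating is an illustrative assumption.
import math

def target_segments(events_per_sec: float,
                    events_per_segment: float = 10_000) -> int:
    """Size the stream so no segment exceeds its rated throughput."""
    return max(1, math.ceil(events_per_sec / events_per_segment))

assert target_segments(5_000) == 1     # quiet hours -> minimal footprint
assert target_segments(45_000) == 5    # midday burst -> scale out
assert target_segments(9_999) == 1     # load subsides -> scale back in
```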

When everything is treated as a data stream, event data can be replayed in the same way we watch a live sporting event on a DVR-enabled TV: the program can be paused, rewound, and replayed instantly. Until now, application developers have been limited in their ability to address use cases that leverage data as streams for capture, playback, and archive. Enabling these capabilities will make it easier to build applications that unlock use cases never thought of previously.
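The DVR analogy boils down to an append-only event log that any reader can replay from an arbitrary position. The toy sketch below shows only that core idea; a real streaming store adds durability, sharding, and the autoscaling discussed above.

```python
# Toy illustration of the DVR analogy: an append-only event log
# with positional replay. Not a production streaming store.
class EventStream:
    def __init__(self):
        self._log = []                 # append-only event log

    def append(self, event):
        self._log.append(event)        # "live" ingest

    def replay(self, from_pos=0):
        """Re-read history from any position -- pause, rewind, replay."""
        yield from self._log[from_pos:]

stream = EventStream()
for reading in ("t0:20.1C", "t1:20.4C", "t2:21.0C"):
    stream.append(reading)

print(list(stream.replay(from_pos=1)))  # ['t1:20.4C', 't2:21.0C']
```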

Dataset Management helps solve the data lifecycle problem

In the realm of data management, 2020 will usher in new approaches for organizations wishing to better manage data distributed across many silos of on-prem and cloud data stores. Data growth has been outstripping the growth of IT budgets for years, making it difficult for organizations not only to keep and store all their data, but also to manage, monetize, and secure it and make it useful for end users.

Enter Dataset Management – an evolving discipline that uses various approaches and technologies to help organizations better use and manage data throughout its lifecycle. At its core, it is about the ability to store data transparently and make it easily discoverable. Our industry has been very good at storing block, file, and object data, sometimes unifying it in a data lake. Dataset Management is the evolution of the data lake, giving customers the ability to instantly find the data they want and make it actionable in the proper context across on-prem and cloud-based data stores.

Dataset Management will be especially useful for industries (e.g., media & entertainment, healthcare, insurance) that frequently have data stored across different storage systems and platforms (e.g., raw data generated by devices and instruments, derivative data at the project level, and so on). Customers want the ability to search across these data stores to do things such as creating custom workflows. For instance, many of our largest media & entertainment customers use Dataset Management to connect with asset management databases to tag datasets, which can then be moved to the correct datacenters for special effects work or digital postprocessing, then to distribution, and finally to archives.
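To make the idea concrete, here is a toy sketch of a dataset catalog: datasets carry metadata tags regardless of where the bytes live, so a workflow can find them across stores and route them to the right place. The store names, tags, and routing rule are all hypothetical.

```python
# Toy sketch of the Dataset Management idea: a tag-based catalog
# spanning on-prem and cloud stores. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    store: str                      # e.g. "onprem-nas", "s3://archive"
    tags: set = field(default_factory=set)

catalog = [
    Dataset("shoot-raw-0412", "onprem-nas", {"raw", "project:ares"}),
    Dataset("ares-vfx-plates", "s3://archive", {"vfx", "project:ares"}),
]

def find(tag: str):
    """Search across every registered store by tag."""
    return [d for d in catalog if tag in d.tags]

# Route all 'project:ares' datasets to a VFX datacenter for postprocessing.
for ds in find("project:ares"):
    print(f"move {ds.name} from {ds.store} -> dc-vfx")
```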

Traditional methods for managing unstructured data only take you so far. Thanks to technological advancements like machine learning and higher CPU performance, we see Dataset Management growing further in prominence in 2020, as it offers organizations a bridge from the old world of directories and files to the new world of data and metadata.

Trend #2: Storage will be architected and consumed as Software-defined


We can expect to see new storage designs in 2020 that will further blur the line between storage and compute.

Some of our customers tell us they are looking for more flexibility in their traditional SANs, wishing to have compute as close to storage as possible to support data-centric workloads and to reduce operational complexity.

With deeper integration of virtualization technologies on the storage array, apps can run directly on the same system and be managed with standard tools. This could suit data-centric applications with very storage- and data-intensive operations (e.g., analytics and demanding database apps), as well as workloads that require low transactional latency against large volumes of data.

This isn’t HCI in the classic sense; rather, it is about leveraging and interoperating with existing infrastructure and processes while giving a greater degree of deployment flexibility to suit the customer’s specific environment and/or application. It could open up new use cases (e.g., AI/ML and analytics at edge locations or in a private cloud, workload domains, etc.); it could also lower the cost of ownership and simplify life for IT teams and application owners, who wouldn’t always have to rely on a storage admin to provision or manage the underlying storage.

Software-defined Infrastructure no longer just for hyper-scalers

Software-defined infrastructure (SDI) is also becoming a greater consideration in enterprise data centers as a way to augment traditional SANs and HCI deployments. Long the realm of hyper-scalers, SDI is now being adopted by traditional enterprises to redeploy certain workloads whose capacity and compute requirements differ from what traditional three-layer SANs can provide.

These are customers who architect for agility at scale and want the flexibility to rapidly scale storage and compute independently of each other, and customers who need to consolidate multiple high-performance (e.g., database) or general workloads. As enterprises pursue consolidation strategies, they will bump up against the limits of traditional SANs and against the unpredictable performance, costs, and lock-in of cloud services. This is where SDI becomes a very viable alternative to traditional SANs and HCI for certain workloads.

Trend #3: High-performance Object storage enters the mainstream


As Object storage moves from cheap-and-deep cold storage or archive to a modern cloud-native storage platform, performance is on many people’s minds.

One of the reasons we see this trending upward this year is demand from application developers. Analytics is also driving a lot of demand, and we expect to see companies in different verticals moving in this direction.

In turn, the added performance of flash and NVMe is creating tremendous opportunity for Object-based platforms to support workloads that require speed and near-limitless scale (e.g., analytics, Advanced Driver Assistance Systems (ADAS), IoT, cloud-native app development, etc.). A side note: historically, Object storage hasn’t been fast enough for ADAS workloads, but all-flash is changing that conversation.

Flash-based Object storage with automated tiering to disk offers a cost-effective solution, particularly when a customer is talking about hundreds of petabytes or exabyte-scale. It allows you to move the data you need up to the flash tier to run your analytics and high-performance applications and then move the data off to a cold or archive tier when you’re done with it.
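The tiering decision itself is simple to express. The sketch below keeps recently accessed objects on flash for analytics and demotes cold ones to the archive tier; the 30-day hot window is an illustrative assumption, not a product default.

```python
# Hypothetical tiering rule: hot objects stay on flash, cold objects
# are demoted to archive. The 30-day threshold is an assumption.
from datetime import datetime, timedelta

def choose_tier(last_access: datetime, now: datetime,
                hot_window: timedelta = timedelta(days=30)) -> str:
    """Objects touched inside the hot window stay on the flash tier."""
    return "flash" if now - last_access <= hot_window else "archive"

now = datetime(2020, 2, 8)
print(choose_tier(datetime(2020, 2, 1), now))   # flash   -> analytics-ready
print(choose_tier(datetime(2019, 10, 1), now))  # archive -> cheap and deep
```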

As Object becomes tuned for flash and NVMe, we expect a higher level of interest in Object for data that has traditionally been stored on file-based NAS, such as images, log data, and machine-generated data.

As the pace of technology innovation accelerates, so too will the possibilities in storage and data management. We are standing with our customers at the dawn of the “Data Decade.”