Saturday 29 June 2019

Enabling VDI in VMware Cloud on AWS with Dell EMC Unity Cloud Edition

Virtual Desktop Infrastructure (VDI) enables organizations to provide secure desktop environments for their employees, and access can be extended to non-employees for tasks that require corporate resources. VDI can run on mobile devices as well as traditional desktop and laptop platforms, and desktops can be provisioned and de-provisioned securely and rapidly, whether delivered from on-premises datacenters or via cloud-hosted services.


VMware Horizon 7 enables organizations to deliver a scalable and automated solution for VDI deployments. The common management tools and uniform Horizon platform simplify extending the VDI environment across both cloud and on-premises deployments. While VDI deployments can use either stateless/non-persistent desktops or stateful/persistent desktops, there is a need to retain and protect user data and possibly replicate this data to other sites. Virtual desktop deployments that replace a user’s typical desktop client environment with a VDI platform typically require support for persistent data, including user home directories. Persistent file shares are easy to provide on-premises using a platform like Dell EMC Unity storage, but when running in a public cloud, a solution like Dell EMC Unity Cloud Edition is needed.

Dell EMC Unity Cloud Edition provides a ready-made solution for storing file data, including user home directories, with a robust set of enterprise-level file services developed to meet the needs of large-scale multiuser and transactional file environments. The Unity file system delivers the scalability, efficiency, and flexibility needed for both cloud and large-scale on-premises VDI environments.

Dell EMC Unity Cloud Edition can be easily deployed in VMware Cloud on AWS to provide multi-protocol file access over SMB, NFS, FTP, and SFTP. It supports a wide spectrum of features, including pointer-based snapshots, NDMP backup, user- and tree-level quotas, file-level tiering to cloud object repositories based on user-defined policies, the ability to seamlessly shrink and extend file system size without disrupting client access, and a Common Event Enabler that allows applications to scan for viruses and receive file event notifications for auditing, quota management, search/indexing, and more. Dell EMC Unity Cloud Edition also supports replication with other physical and virtual instances of Dell EMC Unity systems across multiple sites.

Additionally, Dell EMC Unity Cloud Edition Unisphere provides a powerful unified management framework composed of an HTML5 graphical user interface, command line interface, and RESTful API allowing novice and experienced administrators alike to easily manage their file storage environments. Wizard-based file provisioning enables novice administrators to quickly get a file storage environment up and running. The CLI and RESTful API allow more seasoned administrators to create complex scripts to facilitate specific use cases, while still using the Unisphere GUI for daily provisioning and management tasks.
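For administrators who prefer scripting, here is a minimal sketch of driving Unisphere’s RESTful API from Python. The X-EMC-REST-CLIENT header and CSRF-token handshake are standard for Unity REST, but the management address, credentials, object IDs and the file-system creation payload shown here are illustrative; check them against the Unisphere Management REST API reference for your Unity OE version.

```python
# Minimal sketch: driving Unisphere's REST API from Python. The
# X-EMC-REST-CLIENT header and CSRF-token handshake are standard for
# Unity REST; the address, credentials, IDs and the createFilesystem
# payload below are illustrative only.
import requests

UNITY = "https://unity.example.com"            # hypothetical management address
session = requests.Session()
session.auth = ("admin", "Password123!")       # lab credentials only
session.headers["X-EMC-REST-CLIENT"] = "true"
session.verify = False                         # self-signed cert in a lab

# Any GET establishes a session and returns the CSRF token needed for POSTs.
r = session.get(f"{UNITY}/api/types/loginSessionInfo/instances")
session.headers["EMC-CSRF-TOKEN"] = r.headers["EMC-CSRF-TOKEN"]

# Illustrative call: create a 100 GB file system for VDI home directories.
payload = {
    "name": "vdi_home_dirs",
    "pool": {"id": "pool_1"},        # assumed pool ID
    "nasServer": {"id": "nas_1"},    # assumed NAS server ID
    "size": 100 * 1024**3,
}
resp = session.post(
    f"{UNITY}/api/types/storageResource/action/createFilesystem",
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```

The same provisioning flow is available through the wizard-driven Unisphere GUI; the API simply makes it repeatable for scripted, large-scale VDI rollouts.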

For a demonstration of the Unity Cloud Edition VDI user file shares solution in action, see the video available here.


The VDI use case described here is one of many opportunities for improving private and public cloud application deployments using the capabilities of Dell EMC Unity delivered by the software-defined Unity Cloud Edition.

Thursday 27 June 2019

Cybera Makes a Strong Case for Accelerating Your IT Refresh Cycle

This is the story of how a well-established and respected managed network services provider decided on an aggressive, business-bolstering 3-year server refresh cycle.


We take it for granted that our experience at the gas station, grocery store, bank, or doctor’s office always offers fast, easy, and secure transactions, as well as readily available inventory and immediate updates to loyalty points. Ever wonder how merchants offer this seamless experience?

Chances are the merchant you visit depends on Cybera cloud-based managed network services for their applications, security, and PCI compliance. In turn, the Cybera services rely on PowerEdge servers for a competitive edge. Here’s a deeper look at the role PowerEdge plays in Cybera’s winning strategy, and how an accelerated IT refresh schedule helps keep their customers satisfied and their business at the forefront of technology.

Let’s start at the beginning…

Who is Cybera?


Cybera fundamentally changes the way highly distributed businesses use technology. The purpose-built cloud platform empowers customers to rapidly deploy, secure, and optimize new cloud-based applications and services, especially at the edge of the network (such as remote sites or individual IoT devices). This approach helps ensure security and performance while reducing the cost, disruption, and complexity of delivering new applications and services across globally distributed enterprises.

With Cybera, companies can operate with greater agility across their entire application ecosystems—leveraging technology that frees them up to focus on what they do best. The approach is ideal for loyalty programs, omni-channel strategies, compliance, IoT deployment, payment processing, and back-office applications across hundreds or even thousands of remote sites.

Cybera delivers these cloud-based services using a software-defined solution running on Dell EMC PowerEdge servers and VMware vSphere. To keep up with customer and market demands, Cybera has adopted an aggressive 3-year refresh cycle to deliver the advanced levels of performance and agility their customers have come to expect.

Why Dell EMC PowerEdge


We have a wave, and we’ve got to stay in front of it … A 5-year depreciation cycle isn’t going to work. A more aggressive 3-year refresh is best for our cloud strategy. Customers have growing challenges and pain points that they need to have addressed. PowerEdge servers sit at the heart of our cloud services.

– Troy Crabtree, Cybera Executive Vice President of Operations

Cybera recognized a growing need to change their infrastructure strategy to support their customers’ rapidly expanding business requirements. This meant transitioning to become a more nimble and agile enterprise backed by a software-defined data center. Choosing PowerEdge servers to upgrade their cloud-based platform, Cybera accelerated their server refresh cycles to support the shift from a static and inflexible architecture to one that is increasingly scalable, automated, and secure. The new architecture is the next step in Cybera’s evolution to deliver IT as a service to customers across the globe in markets as diverse as petroleum, hospitality, and healthcare.

PowerEdge servers have helped the company maximize performance and scalability without sacrificing security, helping them to reduce unplanned downtime by as much as 40% and decrease application errors by 30%. Those capabilities are already paying dividends since Cybera runs a virtualized environment with several thousand virtual machines using VMware vSphere. In particular, vSphere enables rapid provisioning of apps at remote locations and provides enhanced visibility across multiple apps to simplify compliance.

Dell Technologies has been a trusted partner of Cybera for more than a decade. For the recent accelerated server refresh, the company completed a thorough analysis before choosing the latest generation of PowerEdge servers and VMware.

Read more in the PDF: Cybera Accelerates Server Refresh to Stay Ahead of Customer Demands

Tuesday 25 June 2019

Data Management: Design Principles

In the recent blog “Redefining Data Protection,” Sharad Rastogi discusses unlocking the value of data capital by evolving data protection to that of a leverageable service to help drive business outcomes. In this context, data protection transitions to data management.

Sharad describes key aspects of a data management solution, including use of any data source, target, service level objective, location, use case, consumption model and business model. With those considerations in mind, the next question is, what are the design principles for a data management solution?

At the highest level there are two primary design principles: a Software-Defined Platform and Multi-Dimensional Appliances. Let’s unpack both.

Software-Defined Platform


In order to support the wide range of capabilities required, a robust data management solution requires a significant amount of flexibility. The most efficient method of delivering that flexibility is through a software-defined, API-based platform. Below are some of the core tenets of the software-defined platform:

◈ Form factor: While the software-defined data platform can be delivered as an integrated appliance, the same capabilities can be obtained in a software-only form factor, installed on a software-defined hardware platform on-premises or in the cloud.

◈ Data services: Software-defined applies beyond form factor — it also pertains to the platform’s ability to provide flexible data services. A single software-defined platform provides the full suite of data protection capabilities from archive, long-term retention and backup / recovery to disaster recovery and business continuity spanning the entire RTO / RPO spectrum.

◈ Cyber recovery: It supports the ability to recover your data on-premises or in the cloud, and the capability to recover data in the event of a ransomware attack by providing secure air-gapped solutions.

◈ Efficiency, security and integrity: The solution should support data reduction techniques such as compression and deduplication while ensuring the safety of the data through encryption and its integrity through a data invulnerability architecture.

◈ Data management: The software-defined platform supports a wide range of data management use cases from fast, lightweight copies for dev / test, analytics, etc. to various compliance use cases, including HIPAA, SEC and GDPR.

◈ Flexible architecture: The solution is architected using a flexible and scalable modern, services-based platform, enabling support for a full spectrum of workloads ranging from traditional enterprise applications to modern cloud native applications.

Data Protection Evolution

◈ Access methods: The platform supports a variety of access methods, including full data restore, application-directed recovery, and API access for third-party integrations. The API architecture enables the full power of the platform through published, stable and well-documented APIs.
◈ Consumption methods: It provides the ability to consume capabilities either as a platform managed by the end user or as a SaaS offering, which is managed by the provider.
◈ Automation: The platform embeds and leverages artificial intelligence and machine learning techniques to automate commonly executed workflows: placing data on the correct tier and media type, detecting and mitigating system and security issues, providing access through natural language processing (NLP) channels, and more. A minimal illustration of rule-based tiering follows this list.
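To make the automation tenet concrete, here is a deliberately simple, hypothetical sketch of rule-based data placement. The tiers, thresholds and record fields are invented, and a real platform would learn such policies from access patterns rather than hard-code them.

```python
# Hypothetical sketch of policy-driven data placement of the kind such a
# platform might automate. Tiers, thresholds and fields are invented.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataSet:
    name: str
    last_access: datetime
    compliance_hold: bool      # e.g. subject to SEC/HIPAA retention rules

def choose_tier(ds: DataSet, now: datetime) -> str:
    """Map a data set to a storage tier using simple access-age rules."""
    if ds.compliance_hold:
        return "worm-archive"          # immutable long-term retention
    age = now - ds.last_access
    if age < timedelta(days=7):
        return "performance-flash"     # hot copies for dev/test, analytics
    if age < timedelta(days=90):
        return "capacity-disk"         # routine backup / recovery
    return "cloud-object"              # cold, long-term retention

now = datetime.now()
for ds in (DataSet("crm-db-copy", now - timedelta(days=2), False),
           DataSet("2016-audit-logs", now - timedelta(days=400), True)):
    print(f"{ds.name} -> {choose_tier(ds, now)}")
```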

Multi-Dimensional Appliances


The second foundational tenet required to complement a software-defined platform is multi-dimensional appliances, and those core elements are:


◈ Scale: The appliance must have the ability to scale in place, scale up and scale out. It can scale in place by starting small and adding capacity, either through more disk or flash drives or through licensing, in the same form factor. It can scale up by adding more disk or flash trays behind an existing controller. And it can scale out by adding additional appliance capacity units.

◈ Media: The type of storage can be traditional spinning disk media, all flash, or emerging media such as Non-Volatile Memory Express (NVMe) and next-generation Storage Class Memory (SCM). A traditional backup storage scenario may leverage all HDDs, possibly complemented with a small amount of flash. Alternatively, the high performance of all-flash media may be optimal for dev/test and analytics use cases.

◈ Deployment: The same appliance configuration can be deployed on-premises in an integrated form factor, or in a software-only form factor writing to commodity protection storage. It can also be deployed in the cloud as a software-only appliance writing to object storage or offered as SaaS by a service provider.

◈ Use cases: The appliance is designed to support a full range of software-defined platform use cases, ranging from traditional, capacity-oriented use cases (archive, long-term retention, backup, restore) to performance-oriented use cases (replication, disaster recovery, dev/test, analytics).

◈ Security and integrity: The appliance supports security capabilities such as encryption in place and in flight plus key management. It also supports data resilience and integrity through the data invulnerability architecture.

◈ Management: The multi-dimensional appliance can be managed through traditional on-board system management techniques. It can also be managed through a SaaS-based management portal that can manage large, multi-site environments. Additionally, the platform provides rich APIs for third-party integrations and custom, end user workflows.

◈ Resiliency: The appliance auto-discovers component and system failures, as well as security intrusions and anomalies. In addition to alerting the administrator, it attempts to remediate the fault through a self-healing architecture or block out suspicious activity and data sets.

◈ High availability and non-disruptive operations: The system provides high availability and non-disruptive operations (NDO) through component-level redundancy and heuristic-based predictive software that proactively discovers, isolates and remediates failures. The system can upgrade different software and firmware non-disruptively and with minimal operator intervention.

◈ Search and analytics: The multi-dimensional appliance provides rich search and analytics functionality. It provides predictive search capabilities at the VM level, for files within the VM, and even for content within those files. It provides detailed analytics on the nature of the stored data, from the type of files, to their age, to the sensitivity of the content.

◈ Efficiency: Efficiency is applied in the form of data reduction techniques such as deduplication and compression, which also reduce bandwidth use when data is sent over the wire (a minimal deduplication sketch follows this list). The appliance is also cloud aware: for example, when searching a data set stored in the cloud, it displays only the catalog and selectively downloads just the files needed, reducing cloud egress costs.

◈ Performance: The appliance supports a wide range of performance characteristics in support of a broad range of RPOs and change rates. It supports enough ingest streams, at high enough ingest rates, to deliver even a zero RPO (i.e. no data loss) on a rapidly changing workload.
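As promised above, here is a minimal sketch of the deduplication idea behind the efficiency tenet: split data into chunks, hash each chunk, and store only chunks not seen before. Real appliances use variable-size, content-defined chunking and persistent indexes; the fixed-size chunking below is purely illustrative.

```python
# Minimal fixed-size deduplication sketch. Real systems use
# content-defined chunk boundaries and persistent chunk indexes.
import hashlib

CHUNK = 4096  # bytes; real chunk sizes and boundaries vary

def dedup_store(data: bytes, store: dict) -> list:
    """Store unique chunks in `store`; return the recipe (hash list)."""
    recipe = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # write only unseen chunks
        recipe.append(digest)
    return recipe

store = {}
backup1 = b"A" * 8192 + b"B" * 4096
backup2 = b"A" * 8192 + b"C" * 4096      # mostly unchanged data
r1 = dedup_store(backup1, store)
r2 = dedup_store(backup2, store)
print(len(store), "unique chunks for", len(r1) + len(r2), "logical chunks")
# -> 3 unique chunks for 6 logical chunks: the repeated "A" chunks dedupe
```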

Hear more about Dell EMC’s perspective regarding data management in a CUBE Conversation with Beth Phalen (President, Data Protection Division) and Sharad Rastogi (SVP, Product Management) during Dell Technologies World 2019.


Beth Phalen and Sharad Rastogi share the Dell EMC perspective at theCUBE in Las Vegas

Modernizing IT should be a priority for all organizations as data continues to power the future of business. Effectively and efficiently protecting AND managing that data to drive business outcomes may determine who wins and who loses in the race toward data insights. The design principles above provide guidance toward identifying the appropriate data management solution attributes for your company.

Saturday 22 June 2019

Formulating a Digital Rescue Plan

The Fourth Industrial Revolution is here. Think of this as the data era: a time of deep connectivity driven by technology and powered by unprecedented amounts of data. We are at a pivotal point in human history where the possibilities are endless, and the implications are huge.

The first three industrial revolutions were also pivotal points in human history. Each was unique, but all were alike in their impact on life as we knew it at the time. Each period introduced new ways to harness energy to drive improvements. These improvements fundamentally changed existing industries and created new ones, along with new economic models: the shift from agriculture to factories in the first revolution; mass production, communication, and transportation in the second; and automation and the rise of electronics in the third. With each industrial revolution, we created more efficient ways to work, in turn improving productivity and driving growth rates.

The Fourth Industrial Revolution, however, is about data and the digitalization of everything from cities and factories to homes and cars. This interconnectedness, together with emerging technologies such as 5G, AI, AR/VR, and IoT, are blurring the lines between our physical and digital worlds and driving new emphasis on the customer experience.

Many businesses today are already feeling the impact of this digitalization and the challenges associated with storing, analyzing, activating, and securing the resulting data. And as many businesses are just beginning to formulate a plan to realize the benefits of their data, the amount of information they are managing continues to grow at a rate that is only now becoming apparent.

◈ From the Dell Technologies Global Data Protection Index Study, we learned that organizations managed an average of 9.7PB of data in 2018, compared with 1.45PB in 2016. That’s explosive growth of 569% ((9.7 − 1.45) / 1.45 ≈ 5.69).
◈ In a recent whitepaper, IDC predicts that the Global Datasphere will grow from 33 Zettabytes of data to 175 Zettabytes by 2025.
◈ The Dell Technologies Digital Transformation Index Study states ‘information overload’ is rapidly climbing the top 10 list of barriers businesses are facing when trying to digitally transform.

This explosion of data is causing rippling effects in every industry. From the cloud to the edge, business leaders are being challenged to push the boundaries of status quo.

The Digital Rescue Plan


If you find yourself on the wrong side of this growth rate, you’re not alone. From the Digital Transformation Index Study, we also learned that despite the undeniable move towards a digital world, 91% of businesses’ digital transformation programs are still in their infancy. But our study also shows these businesses are beginning to formulate a digital rescue plan to increase productivity, profitability, revenue growth, customer retention, and ROI. And 77% of business leaders believe that within 5 years they will be able to harness emerging technologies to predict customer demands and manage resources, while 68% intend to use emerging technologies to improve supply chain efficiency.

What I find interesting about this study is the shift in technology investments businesses are making in the next 1-3 years, as compared to the initial study in 2016.  For example, in 2016, cybersecurity, quantum computing, VR/AR, and cognitive systems investments weren’t even on the top 10 list. In 2018, cybersecurity, IoT technology, multi-cloud environment and AI top the list, with cybersecurity jumping to the number one spot.  This shift in technology investments indicates that businesses are thinking more about securing their data and emerging technologies than just two years ago.

Digital Transformation Index Study Top Technology Investments 2016 vs 2018

And while data security is critical, businesses must also consider data creation. More applications are running at the edge than ever before. With IoT, smart cars, and smart cities, data generation at the edge is on the verge of exploding. In a recent Gartner article titled What Edge Computing Means for Infrastructure and Operations Leaders, Gartner notes that “around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud. By 2025, Gartner predicts this figure will reach 75%”. Bottom line: the edge is going to play a critical role in the future of data creation.

In the Fourth Industrial Revolution, a comprehensive digital rescue plan must include modern infrastructure to support emerging technologies and the data growth businesses must manage, an edge-to-cloud data strategy, and the tools and education your workforce will need to help your business succeed.

Transformation in Action


If the first three industrial revolutions have taught us anything, it’s that life as we know it will change, jobs will be replaced by technology, new jobs will be born, and we will be able to work faster and more effortlessly. But the road to get there is not simple. Digital transformation is a complex process that takes time.

I encourage you to read the full Digital Transformation Index Study.  Also, be on the lookout for my upcoming blogs where I will feature customers who have gone through their own digital transformation journey, leveraging emerging technologies. They’ll share valuable insights and advice for you to leverage in your own transformation journey.

There’s no question, the data era is here. The question is, will you be ready?

Thursday 20 June 2019

Be a Smart Port City of Call!

Port of Rotterdam Authority, Photographer: Eric Bakker

Over the years, ports have constantly evolved and embraced innovation to stay relevant. Just think about the advent of the bridge crane and shipping containers, which radically transformed how materials were shipped and handled.

Once again, the winds of change are blowing. Homogenous competition among ports is forcing operators to think about value innovation. With Brexit coming down the tracks and the US negotiating new global trade agreements, things look set to become even more complex. The current business model of seeking competitive advantage and profitable growth by focusing on loading and unloading services is no longer sustainable.

Port transformation initiatives


Competition aside, ports also must manage environmental protection requirements, security concerns (both cyber and physical), data protection, and visibility/traceability of client assets. As a result, all major ports have begun to explore transformation initiatives to differentiate their offering and gain competitive advantage.

IoT connectivity solutions


We’re already seeing some of the top players implement IoT solutions to deliver traffic management systems, automation and digital invoicing (customs). For example, in Finland, port operator Steveco in the Port of HaminaKotka is using a private LTE network, provided by Nokia and Ukkoverkot, running on Dell Edge Gateways.

This dedicated low-latency network enables wirelessly connected cameras on cranes to provide real-time video streaming and analytics, as well as business-critical connectivity for trucks, sensors and workers. Through the new connectivity, the port operator has seen improvements across multiple areas, from improved situational awareness of container handling to warehouse logistics and port security. Read more about the solution here.

Maintenance solutions


Another customer’s solution analyzes voyage data, equipment data, and network structures to optimize overall operational performance for ship owners, partners and customers. Increasingly, we’re also seeing condition-based maintenance systems – equipment monitoring in real time to remotely service products via the Internet, so parts can be swapped out at the next port of call in order to reduce the ship’s operational down-time and avoid costly delays.

Other examples include 3D printing and using Augmented Reality to intuitively guide ship repairs by technicians without expert knowledge of the individual ship’s system.

Smart ships with automatic berthing


Along with reduced workload in the engine room due to cleaner fuels, I believe that all these advances will allow further reductions in minimum crew sizes. As a result, ports can expect to see smart ships that can be managed and maintained from a central base of operations. Think automatic collision avoidance, automatic berthing, a self-monitoring hull, engine and cargo; the ability to sail autonomously for a limited time in certain conditions and even no-crew drones for specific applications, such as short-distance ferries, tugs and fireboats.

Semi-autonomous barges


The city of Delft in the Netherlands is a great example. Here, Dell Technologies OEM & IoT and Nokia have teamed up on a digital city project to deliver goods using semi-autonomous, hydrogen-powered barges on existing canals for ‘last mile’ transportation. The project uses world-class technologies from Dell and Nokia for compute, storage, data management, connectivity, analytics, IoT and blockchain. Testing will progress through 2019 with the goal of becoming fully operational by the end of the year.

The good news is that this will help reduce city center truck congestion and carbon emissions.

Intelligent planning


I also envisage that intelligent planning, based on AI-assisted data analytics, will manage complex variables such as optimal sailing routes, vessel speed and fuel consumption, upcoming weather fronts and on-board weight distribution. Of course, from a staffing perspective, all of these developments will demand new skills and training.

Meanwhile, operational decisions at the port will also be data driven. AIS (Automatic Identification System) is a great example: this satellite-based data exchange allows virtually all cargo ships to be tracked for vessel routing, taking into account the weather, the traffic situation and port capacities along the route.

Becoming a smart port?


The next big question is: how do you become a smart port? What do you invest in? How is it defined?

In my view, piecemeal IoT projects, while positive, will not make your port smart. Rather, becoming a smart port demands a major, systemic mind-set change. Ports need to look outside, focus on global security risks rather than local interests, and think about the efficiency and sustainability of the entire shipping logistics value chain.

Big picture view


A supermarket cold chain is a classic example of how end-to-end business collaboration for a smart port city might interconnect and work. It starts with the shipper of the goods in the country of origin: say, a banana producer moves to smart reefers and container tracking delivered by a shipping company. The chain runs through the arrival of the goods in the port of destination (customs clearance), the hoisting of the container using smart cranes, the transfer of the container to an IoT-tracked truck, end delivery to the stock room in the supermarket, bananas on the shelves, and finally bananas on your favorite breakfast cereal.

I believe that to become a smart port, all stakeholders need to take this big-picture, banana-from-the-tree-to-the-cereal view. It’s no longer enough to just deliver your piece in the value chain. Every single piece must fit together like a jigsaw puzzle and work together in harmony.

Collaboration is the name of the game


It’s wonderful to see that some great, joined-up initiatives are already taking place. At Hamburg, for example, connected-port initiatives are helping to double capacity, but not space, by 2025, simultaneously reducing operating costs for operators and logistics costs for cargo owners.

Meanwhile, Rotterdam’s interconnected information hub, the Portbase Port Community System, offers a one-stop shop for logistics and information exchange that addresses the needs of all stakeholders, from port customers through terminal operators and service providers. Rotterdam even uses 3D printing technologies to support the maintenance and repair of parts and accessories.

Put your port on the map


In summary, I believe that ports now have a unique opportunity to reposition themselves, redefine their business model and become smart and connected. According to McKinsey, if policy makers and businesses get it right, linking the physical and digital worlds could generate up to $11.1 trillion a year in economic value by 2025. However, McKinsey makes the valid point that you need to know where and how to invest.

Collaborating with a tier 1 technology partner with the right expertise and track record is the right navigation route. Our maritime business was set up to help customers like you reinvent their business models.

Tuesday 18 June 2019

Meet Deep Learning with Intel – The New Addition to the Dell EMC Ready Solutions for AI Portfolio

The new Dell EMC Ready Solutions for AI – Deep Learning with Intel accelerates AI insights, optimizes TCO and offers a fast on-ramp for deep learning workloads.


In a quest to bring the promise of artificial intelligence to life and capitalize on the massive amounts of data generated on a 24×7 basis, many organizations are rushing to pull together the different technology elements needed to power deep learning workloads. Today, this quest just got a lot easier with the launch of a new Ready Solutions for AI based on Dell EMC and Intel innovation.


The Deep Learning with Intel solution joins the growing portfolio of Dell EMC Ready Solutions for AI and was unveiled today at the International Supercomputing Conference in Frankfurt. This integrated hardware and software solution is powered by Dell EMC PowerEdge servers, Dell EMC PowerSwitch networking, and scale-out Isilon NAS storage; leverages the newest AI capabilities of the 2nd Generation Intel® Xeon® Scalable processor microarchitecture and the Nauta open source software; and includes enterprise support. The solution empowers organizations to deliver on the combined needs of their data science and IT teams and to leverage deep learning to fuel their competitiveness.

Dell Technologies Consulting Services help customers implement and operationalize Ready Solution technologies and AI libraries, and scale their data engineering and data science capabilities. Once deployed, ProSupport experts provide comprehensive hardware and collaborative software support to help ensure optimal system performance and minimize downtime. Additionally, Education Services offers courses and certifications on data science, advanced analytics and more.

AI simplified


The new Deep Learning with Intel solution simplifies the path to AI-powered applications with the fully featured, container-based Nauta deep learning platform, which offers an innovative template pack approach that eliminates the need for data scientists to learn the intricacies of Kubernetes. In addition, Dell EMC’s data scientists have built use case examples for image recognition, natural language processing and recommendation engines to help customers understand the capabilities of the solution’s architecture.

Deep Learning with Intel is also pre-configured with the TensorFlow deep learning framework, the Horovod distributed training framework, and all the requisite libraries for data modeling. This simplified path to productivity is both easy to set up and easy to use, and it empowers your data scientists to spend their time building models that generate value instead of wrangling with IT infrastructure.
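For a flavor of what distributed training with this stack looks like, here is a minimal Horovod-on-TensorFlow sketch of the general pattern. It is not code shipped with the Ready Solution; the dataset, model and hyperparameters are placeholders.

```python
# Minimal sketch of data-parallel training with Horovod and TensorFlow
# (the general pattern; not code shipped with the Ready Solution).
# Launch across nodes with, e.g.: horovodrun -np 16 python train.py
import tensorflow as tf
import horovod.tensorflow.keras as hvd

hvd.init()  # one worker process per node or CPU socket

(x, y), _ = tf.keras.datasets.mnist.load_data()  # placeholder dataset
x = x[hvd.rank()::hvd.size()] / 255.0            # naive data sharding
y = y[hvd.rank()::hvd.size()]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Scale the learning rate by worker count and wrap the optimizer so
# gradients are averaged across workers on each step.
opt = hvd.DistributedOptimizer(tf.keras.optimizers.SGD(0.01 * hvd.size()))
model.compile(optimizer=opt, loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Broadcast initial weights from rank 0 so all workers start identically.
callbacks = [hvd.callbacks.BroadcastGlobalVariablesCallback(0)]
model.fit(x, y, batch_size=64, epochs=1, callbacks=callbacks,
          verbose=1 if hvd.rank() == 0 else 0)
```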

Faster, deeper AI insights


Once the Deep Learning with Intel solution is up and running, you’re positioned to accelerate model training and testing with the power of 2nd Gen Intel® Xeon® Scalable processors. This next-generation processor is at the heart of the Dell EMC PowerEdge C6420 servers used in the Deep Learning with Intel solution, and together with the newest software optimizations for TensorFlow and supporting libraries, model training time is greatly reduced. The processor also includes new Vector Neural Network Instructions (VNNI) that radically speed up deep learning inference workloads, using more efficient 8-bit integer data formats and instructions to power through four times as much data as is possible with 32-bit single-precision floating point methods.
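The four-times claim follows from the data format itself: an 8-bit integer takes a quarter of the space of a 32-bit float, so four times as many values fit in each register, cache line and memory transfer. The NumPy sketch below illustrates the kind of symmetric int8 quantization involved; it demonstrates only the format trade-off, not Intel’s VNNI implementation.

```python
# Illustration of the int8 format behind the "4x the data" claim.
# This shows the quantization trade-off only, not the VNNI instructions.
import numpy as np

weights = np.random.randn(1024).astype(np.float32)

# Symmetric quantization: map [-max|w|, +max|w|] onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

print(weights.nbytes, "bytes as FP32")   # 4096
print(q.nbytes, "bytes as int8")         # 1024 -> 4x more values per byte
print("max abs error:", np.abs(weights - q * scale).max())  # ~scale/2
```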

The solution is integrated with the multi-user, open-source Nauta software platform, which enables containerized training workloads that ran up to 18% faster than the same workloads on a bare metal system. As your organization’s needs grow, the Deep Learning with Intel solution enables near-linear scaling, achieving 80% of theoretical maximum performance when the number of compute nodes is scaled from one to 16.

Enhanced TCO


Finally, the new Ready Solution for AI is great from a total cost of ownership perspective, especially when compared to cloud and hardware-accelerated solutions. For deep learning training workloads, for example, the three-year TCO is 24% less for Deep Learning with Intel relative to a leading public cloud service, while providing double the compute time (24 hours per day versus 12 hours per day) and ten times the storage capacity (100TB versus 10TB).

Public cloud AI service costs can vary widely, and monthly charges can be surprisingly high when inadvertent mistakes lead to runaway processes that consume excessive CPU time or generate massive volumes of data. On the other hand, the Dell EMC Deep Learning with Intel on-premises solution provides managers and financial accountants with known and predictable expenses, while enabling your organization to drive standardization and control over your infrastructure and data.

Key takeaways


The new Dell EMC Ready Solutions for AI – Deep Learning with Intel is an ideal choice for organizations looking to leverage container-based environments to run both single node and distributed deep learning training and inferencing workloads. It simplifies the path to productivity for data science teams and IT and delivers better-than-bare-metal performance. And like all Dell EMC Ready Solutions, this solution is based on a linearly scalable building‑block approach so your deep learning environment can grow to meet your changing needs as time goes on.

Here’s the bottom line: With the included software, servers, networking, storage and services, all optimized for AI workloads, the Deep Learning with Intel solution gives you just about everything you need for an AI-powered organization. Just add data and stir.

Monday 17 June 2019

Taking the Fear Factor Out of AI


For decades, films like 2001: A Space Odyssey, WarGames, The Terminator and The Matrix have depicted what the future would be like if artificial intelligence (AI) took over the world. Fast forward to 2019, and AI is quickly becoming a reality. The things we used to see only in the movies are improving our daily lives, and we often don’t realize it.

We’ve been living with AI assistance for quite some time. We use Waze and Google Maps to predict traffic patterns and find the shortest driving routes. We let Roomba navigate our homes and keep our floors clean. We trust flight operators to use auto-pilot while in the air, so that they rarely focus on anything other than takeoffs and landings. Even our data centers are getting smarter, with learning technologies that automate workload sharing, data tiering and data movement. All these functions require AI, and they are providing us positive experiences. We are accepting them into our lives at such a rapid pace that we are now beginning to expect this level of assisted intelligence from the products and services with which we interact.

On the flip side, there are many new, broader, more fully autonomous AI applications that get at the heart of what the sci-fi community has exploited to the point of giving us the creeps. Think robot wars, big-brother mass surveillance, or the extinction of the human race. It’s human nature to fear the unknown, and the fact that technology fast-tracks innovation faster than society can adapt continually opens technologies like deep learning up to fear mongering. But I recently learned first-hand that it doesn’t have to be that way with AI: things first seen as scary or weird can quickly evolve as you see and realize the value they can bring. Once you experience value, that thing becomes normal, and like a drug you want more of it. At that point there will be an obvious separation between the products and services I use: those that have fully embraced the latest technology to pivot their offering (think Tesla, Airbnb, Lyft) and those that are racing to catch up.

I recently had the opportunity to interact with Sophia the Robot, the now-famous AI-powered robot known for her human-like appearance and behavior. Using AI, visual data processing and facial recognition, Sophia can imitate human gestures and facial expressions, answer certain questions and hold simple conversations on topics she has been trained on. As is the norm with AI, she has been designed to get smarter over time and gain social skills that help her interact with humans, much as other humans would.

When I first ‘met’ Sophia, it was awkward. I couldn’t stop staring at her. But as we conversed and I asked her more questions, I was surprised at how quickly I adapted to her being part of our environment. In less than 24 hours, anything that had felt creepy when first interacting with Sophia was gone. I was referring to her as a person, making jokes with her, and conversing with her as if it was normal. And it was.


My conversation with Sophia the Robot just a few hours after meeting her

My point being: AI is not just future-looking, it is already a big part of our lives. As I learn more about the power of AI, I also want to help you, our customers, gain a better understanding of how important AI will be to your businesses. I know that by experiencing advanced AI firsthand, like I have, you will gain new perspectives on what’s possible when you turn creepy into cool to help humanity and sustain a competitive differentiation in your business.

Most recently, Dell EMC has been working with AI thought leaders to demystify AI through our Magic of AI series, meant to showcase the ‘Art of the Possible’ with the latest machine learning and deep learning techniques. This series uses first-hand experiences with advanced AI as your muse, to help spark ideas about how techniques like video analytics, image detection, and natural language processing can be applied to your industry. For those of you who weren’t able to join us for the inaugural event in NYC with Sophia the Robot, I’m happy to share some of the digital highlights from the experience. You can watch my video interview above with Sophia or check out the highlight reel from the main event at the GMA studios in Times Square.

Sunday 16 June 2019

Dell EMC Doubles Down on VxBlock at Cisco Live: Introducing NVMe Innovations, VxBlock Central 2.0

Earlier this Spring, Dell EMC reaffirmed its decade-long commitment to converged infrastructure (CI) through the multi-year extension of its longstanding systems integrator agreement with Cisco.

At the heart of our CI strategy is the VxBlock 1000, a system that delivers a true mission-critical foundation for the hybrid cloud and helps customers achieve greater simplicity and efficiency.


This year at Cisco Live, Dell EMC is excited to make several announcements that deepen VxBlock 1000 integration across servers, networking, storage and data protection. Together, these announcements represent the next key milestone in our commitment to CI innovation and our customers, backed by our strong relationship with Cisco.

Here’s a look at what we’re announcing today:

Realizing the Power and Performance of NVMe Over Fabrics


NVMe is key to unlocking the next level of cloud operations on CI, but the full business benefit of NVMe can only be realized with an end-to-end infrastructure enabled by NVMe over Fabrics (NVMe-oF).

To help customers realize the full power of NVMe-oF, Dell EMC is announcing new integrated Cisco compute (UCS) and storage (MDS) 32G options, extending PowerMax capabilities to deliver game-changing NVMe performance across the VxBlock stack. This further enhances the powerful architecture, consistent high performance, availability and scalability that’s become synonymous with the VxBlock, helping you to meet the most demanding requirements of high-value, mission-critical workloads.

Now, customers can benefit from extreme end-to-end system performance with one system that can evolve from today’s millisecond to tomorrow’s microsecond latency.

These new compute and storage options will be available to order later this month.

Extending Integrated Data Protection to the Cloud


Dell EMC developed the concept of integrated data protection to help customers protect different tiers of applications and data efficiently and cost effectively — and with precisely the right level of protection for each business need.

While legacy data protection “bolted-on” to a new converged system might work, it may not provide the right level of protection for each service-level need. That’s why Dell EMC offers a flexible family of options for streamlined backup and recovery, data replication, business continuity, and workload mobility to deliver reliable, predictable, and cost-effective availability for Dell EMC converged infrastructure.

As of today, we’re extending our trusted, factory-integrated on-premises data protection solutions for VxBlock to hybrid and multi-cloud environments, including AWS. This release, which will be available to order in July, features options to help protect VMware workloads and data using new, cost-effective Data Domain Virtual Edition and Cloud Disaster Recovery software options.

Simplifying Cloud Operations with VxBlock Central 2.0


Since its introduction in November 2018, Dell EMC VxBlock Central software has helped customers simplify CI administration through converged awareness, automation and analytics.

Today, we’re proud to announce VxBlock Central 2.0. Available this July, VxBlock Central 2.0 features new modular licensing that matches workflow automation, advanced analytics and lifecycle management/upgrade options to your needs.

VxBlock Central 2.0 licensing options include:
  • Base – Free with purchase of a VxBlock, the base license allows you to manage your system and improve compliance with inventory reporting and alerting.
  • Workflow Automation – Provision infrastructure on-demand using engineered workflows through vRealize Orchestrator. New workflows available with this package include Cisco UCS server expansion with Unity and XtremIO storage arrays.
  • Advanced Analytics – View capacity and KPIs to discover deeper actionable insights through vRealize Operations.
  • Lifecycle Management (new, available later in 2019) – Apply “guided path” software upgrades to optimize system performance.
    • Lifecycle Management includes a new multi-tenant, cloud-based database, based on CloudIQ, that will collect and store the CI component inventory structured by the customer. This extends the value and ease of use of the cloud-based analytics monitoring CloudIQ already provides for individual Dell EMC storage arrays.

A Decade of Innovation Continues


Dell Technologies is #1 in IDC’s Certified Reference Systems & Integrated Infrastructure Tracker with a 47.8% share, 1.5X that of the next vendor.* If you’re attending Cisco Live US in San Diego, June 9-13, stop by to talk with Dell EMC and see that innovation first-hand.

Saturday 15 June 2019

Finding the Sweet Spot When It Comes to Your Server Refresh Cycle

Nothing lasts forever. Despite the rumors, even Twinkies have a limited shelf life.


Which is why the server refresh cycle is so important for organizations today. Servers don’t last forever, and waiting too long to replace them can result in downtime and put your core business functions at risk. On the flip side, if you refresh too soon and for the wrong reasons, it could be a costly decision that eats up most of your IT budget.

So How Do You Find That Server Refresh “Sweet Spot”?


When it comes to a server refresh, there are plenty of factors to consider. Cost, frequently run applications, IT staff, current infrastructure, growth objectives, and your plans for emerging workloads all come into play. Unfortunately, with a server refresh there is no magical, one-size-fits-all answer. The best time to refresh your servers is based on your organization’s unique needs and long-term goals. There are obvious costs associated with modernizing your on-premises infrastructure. But there are also substantial costs to NOT doing it. By continuing to run legacy hardware, you could be putting your organization at risk.

In the past, the average server refresh cycle was about 5 years. But that timeline has shifted. Today, it’s not uncommon for businesses to refresh on a 3-year cycle to keep up with modern technology. These companies aren’t just refreshing for the fun of it (although we agree that new servers and data center toys ARE exciting) – they’re doing so to meet increasing demands and strategically position themselves to handle new innovations of the future. They know they need to modernize to remain competitive and prepare for new technologies.

Benefits of a Server Refresh


Modern servers are made specifically to handle emerging workloads. For example, the PowerEdge MX7000 features Dell EMC kinetic infrastructure, which means that shared pools of disaggregated compute, storage, and fabric resources can be configured, and then reconfigured, to meet specific workload needs and requirements.


In addition to handling data-intense workloads, replacing servers and other critical hardware reduces downtime and greatly reduces the risk of server failure. Improved reliability means that your IT staff spends less time on routine maintenance, freeing them up to focus on things that add value to the business.

Additionally, newer servers provide greater flexibility and give you the opportunity to scale as needed based on changing demands. Some workloads, especially mission-critical applications, are best run on-premises, and a modernized infrastructure makes it easier to adapt and deploy new applications. A recent study by Forrester found that Modernized firms are more than twice as likely as Aging firms to cite faster application updates and improved infrastructure scalability.

Modernized servers also enable you to virtualize. By layering software capabilities over hardware, you can create a data center where all the hardware is virtualized and controlled through software. This helps improve traditional server utilization (which is typically less than 15% of capacity without virtualization).

A server refresh presents a tremendous opportunity to improve your IT capabilities. New servers help you to remain competitive and position you for future data growth, innovative technologies, and demanding workloads that require systems integration.

Thursday 13 June 2019

That’s Entertainment, Folks!


Next time you settle down on the couch to watch a Netflix movie, give a passing thought to the technology at the backend powering your entertainment. We talk about how technology has transformed the way we do business, but I think the changes have been equally radical on the home front. Just ten years ago, we all had cable or terrestrial TV, and cell phones were there to make calls while on the move. To quote W.B. Yeats’ famous line: ‘All changed, changed utterly.’

Streaming TV content


Today, we’re viewing on-demand, streaming TV services, like Netflix, using high definition TV displays. According to a recent Deloitte study, 55 percent of US households now subscribe to paid streaming video services, and nearly half of all US consumers streamed TV content every day or weekly in 2017. Not only are consumers across all age groups streaming more content than ever before—they are doing so on smartphones and tablets. Of course, for every change, there’s a consequence. As a personal aside, while I love the convenience of Netflix, a part of me misses the conversations about what was on the box last night – the shared communal experience of friends watching the same TV program at the same time.

Smart phone and social media


Most importantly, ten years on, the smartphone has become ubiquitous, even among older age groups. Did you know that the average person in the UK now spends more than a day a week online? Ofcom, which compiled the underlying report, attributes a large part of the surge in time online to the rise of smartphones, which are now used by 78 percent of the population compared with just 17 percent in 2008, the year after the first iPhone was launched. In fact, the average person now checks their phone every 12 minutes! And of course, not only are people watching video on their smartphones, they are also developing content themselves. Social media apps such as Snapchat and Facebook’s live video option have also given amateur videographers and lifestyle bloggers an easy and cost-effective way to create and distribute content.

Anytime, any device & everywhere


As a result, media & entertainment is no longer tied to a place like a living room or cinema; it’s now available anytime, everywhere, on any device. Boundaries are disintegrating as connected technologies turn everything fluid. Content creation and production are being challenged by a broad range of output, everything from 4K to consumer-generated footage recorded by smartphones, while content delivery is expected to be instantaneous. Media production is also increasingly common in areas such as training, museums, and education.

Mobile video advertising


The trend towards advertisement-free TV viewing has also forced marketers to look for alternative ways to reach viewers that they can no longer hope to attract through TV ads. As a result, more and more video content is being pushed online. Mobile video advertising is growing faster than other forms of digital advertising. With video being so prevalent on mobile devices, advertisers have adapted by creating videos with a vertical perspective to complement the way we hold our tablets and smartphones.

A move to open standards


What does all this mean for broadcasters and production houses? How are they investing, experimenting and innovating? How are they remaining relevant and keeping pace with this level of change? Unsurprisingly, our media & entertainment customers tell us it’s a fiercely competitive market. As a result, they need to stay ahead of the technology curve. Different technology platforms need to work together to make their production workflow as efficient as possible. Consequently, I am seeing the industry move away from expensive, proprietary platforms to open standards. Increasingly, broadcasters and production houses are incorporating standard computing into their solutions to reduce costs, increase agility and exploit new revenue streams. This dynamic is pushing the top ISVs to develop their own out-of-the-box solutions by collaborating with IT companies like Dell Technologies OEM & IoT Solutions.

Technology demands are high


3D animation, VR, compositing, grading and non-linear editing are now placing the compute focus firmly on workstations, while 3D rendering and all aspects of video content creation, from acquisition and transcoding through to distribution, require huge amounts of server processing power. Digital video production, whether for TV or film, emphasizes the storage part of the equation as well as the ability to view and edit data natively in 4K, especially now with 8K and HDR looming on the horizon. With multi-device consumption and ultra-high definition driving increased demand and generating more data, storage like Isilon is needed across all these workflows as a repository to manage content. And, of course, all these tasks require professional-grade monitors with high resolution, precise color grading and industry-relevant color coverage that reproduce data in high fidelity.

ATEME’s story


A customer story brings the picture to life, so let’s take leading video compression company ATEME. Responding to the rise in internet-based video, ATEME wanted to provide a converged, scalable, and virtualized video processing solution for its broadcasting customers. The goal was to increase compression efficiency and video quality while decreasing server footprint. Working with Intel and Dell OEM, ATEME successfully re-engineered its appliance platform. The results speak for themselves: ATEME reported 10 times the channel density, an 80 percent reduction in delivery time and a 25 percent reduction in maintenance overhead. Read the full story here. As Michel Artières, CEO, ATEME said: “We didn’t want just a supplier, we wanted a partner and Dell OEM went the extra mile by testing and fine-tuning the joint solution to reach the highest performances and the best efficiency.”

The road ahead


Looking ahead, the speed of change and the breadth of new technologies continue to grow. 5G will create new business models that will see billions of devices consuming and generating data like never before. Adoption and usage of voice-enabled digital assistants is also growing, suggesting that voice could be the next big thing in human-computer interaction. Other emerging trends include AI, IPv6 protocols, virtual and augmented reality and IoT. To survive, the industry must continue to innovate rapidly.

Dell Technologies is the only company on the planet that has hardware and software solutions that play at every level, from the edge to the core to the cloud. We can deliver all the necessary assets, including scalable, secure, manageable and open infrastructure architecture, IoT and big data expertise, the ability to customize through our OEM division, the right partners plus a sophisticated global support and supply chain.

Tuesday 11 June 2019

Evolution at the Edge


At Dell Technologies World this year, customers and journalists were curious about trends I am seeing in the marketplace and predictions for the future. I shared my views on the impact of 5G, how AI and IoT are continuing to intersect, and the need for businesses to have consistent, flexible infrastructure to quickly adapt. I also emphasized that the foundation of all these transformations is the shift to edge computing, and it’s our OEM & IoT customers across all industries who are leading this evolution.

Location, location, location


At this point, I should clarify what I mean by the edge. I’m talking about data being processed close to where it’s created, versus in the traditional centrally located data center. I like to think of the difference between the data center and the edge as the difference between living out in the suburbs and living in the city, where all the action is. Right now, about 10 percent of enterprise-generated data is created and processed outside a traditional centralized data center or cloud. However, by 2025, Gartner predicts this figure will reach 75 percent. That’s a dramatic shift by any definition.

Three whys


So, why is this happening? Three reasons. First, according to the latest research, the number of connected devices is expected to reach 125 billion by 2030, which will put about 15 connected devices into the hands of every consumer. It simply doesn’t make sense to move all that data to a traditional data center—or even to the cloud.

The second reason is cost. It’s naturally more cost-effective to process at least some of the data at the edge. And third, it’s all about speed. Many use cases just cannot accept the latency involved in sending data over a network, processing it and returning a response. Autonomous vehicles and video surveillance are great examples, where even a few seconds delay could mean the difference between an expected outcome and a catastrophic event.

Edge computing examples


And what kind of compute exists at the edge? Well, it helps me to visualize the edge as a spectrum. On the right end—what I call the far edge—is where data is generated. Picture millions of connected devices generating a constant stream of data for performance monitoring or end-user access. One example is a fluid management system, where valves need to be automatically opened or closed, based on threshold triggers being monitored. If this is something that interests you (using IoT data to help customers better manage and troubleshoot control valves), I recommend looking into our joint solution with Emerson.

Or, consider how the frequency of fridge doors opening in the chilled food section of a supermarket affects the fridge’s temperature levels, and ultimately the food. It would be crazy to send such a massive amount of data to the cloud when it simply indicates a binary safe/unsafe temperature status—the store manager only needs to know when the temperature is unsafe. So, the edge is the obvious choice to aggregate and analyze this kind of data. In fact, we’ve worked with a major supermarket retailer to implement refrigeration monitoring and predictive maintenance at their edge. Today, their cooling units are serviced at the appropriate time, and they’re avoiding millions of dollars in spoiled food. If you’re interested in using data to help avoid food waste, check out our joint solution with IMS Evolve.
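To make that edge filtering idea concrete, here is a minimal sketch in Python. The temperature threshold, the sample batch, and the alert path are all hypothetical placeholders; a real deployment would read from the refrigeration sensors and raise alerts through the store’s monitoring system.

    # Minimal sketch: aggregate raw temperature samples at the edge and
    # forward only threshold violations upstream. All values are placeholders.
    SAFE_MAX_C = 5.0  # hypothetical food-safety threshold in Celsius

    def process_samples(samples):
        """Summarize a batch of readings; return an alert only if unsafe."""
        avg = sum(samples) / len(samples)
        if avg > SAFE_MAX_C:
            return {"status": "unsafe", "avg_temp_c": round(avg, 1)}
        return None  # safe: nothing worth sending over the network

    # e.g. a minute of readings from one fridge in the chilled food section
    alert = process_samples([3.8, 4.1, 6.2, 7.0, 6.8])
    if alert:
        print("notify store manager:", alert)  # stand-in for the real alert path

The point is the asymmetry: thousands of routine readings stay at the edge, and only the rare exception travels over the network.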

Application-driven solutions


Of course, in the vast majority of cases, the application determines the solution. For example, speed is critical in surveillance systems when you’re trying to find a lost child in a mall or identify and stop someone who is a known security threat from entering a football stadium. The last thing you want at the crucial moment is for a cloud environment to tell you that it’s busy searching.

Thanks to the advent of 5G, carriers are addressing the need for higher data traffic performance by placing servers at the base of cell towers instead of at a regional data center. These are all examples where configuration capability, great graphics and high processing performance come into play. And this brings me to another interesting point. When edge computing started, dedicated gateways were the focus. While still important, that definition has expanded to include servers, workstations, ruggedized laptops and embedded PCs.

The micro data center


Another category of edge compute is what Gartner calls the Micro-Data Center. Many of the attributes of a traditional data center come into play here, such as the need for high reliability, the ability to scale compute as needed, and high levels of management. These are conditions that don’t typically demand ruggedized products, but where space limitations are likely.

In these scenarios, customers typically consider virtualized solutions. Remote oil rigs, warehouse distribution centers and shipping hubs are great examples. Just think about the speed of packages flying down a conveyor belt at a distribution center, being routed to the right loading area while the data is being logged in real time for tracking. Batch files are then sent back to a central data center for global tracking, billing, and record keeping. In effect, you have a network of micro data centers at the edge, aggregating and analyzing data, while feeding the most relevant information into a larger regional center.

Looking ahead


In addition to all the practical benefits (such as faster speeds and lower costs), the edge is also driving fresh innovation. After all, the ability to glean immediate insights, experiment, respond in real time and deliver services on-demand are all important criteria in our ever-changing world. In my view, this dynamic will only accelerate with the advent of 5G. By increasing the speed of data analysis, 5G will inevitably increase edge adoption, while businesses already using edge computing will experience the full benefit of 5G networks. Over time, I believe that this combination will inspire a slew of new and exciting applications for both the business and consumer markets.

Saturday 8 June 2019

All Aboard the Scripting Train: Moving from DTK to RACADM

For those PowerEdge server customers who use scripts to manage servers, you’re probably familiar with a longtime Dell EMC tool: the Dell Deployment Toolkit (DTK). What you may not know is that DTK’s days are numbered, and there’s a newer command-line tool that will do the job for many years to come: the Remote Access Controller Admin (RACADM) utility.


As part of our commitment to intelligent automation, we want to make sure our customers can manage servers in the manner that best suits their IT environment, and for many customers, using scripts is their method of choice. RACADM provides a reliable means to do just that. So, to support this transition, we have prepared two new documents for you.

What was DTK for?


But first, a little background. The DTK is a set of utilities, scripts, and configuration files used to configure PowerEdge servers in both Windows and Linux environments. DTK will be sustained for current platforms (14th-generation PowerEdge servers such as the R740, for example) and earlier supported platforms until those platforms pass their end of support life (EOSL) threshold. Dell EMC will not offer support for DTK on future platforms.

If you are using DTK now, we strongly recommend that you start learning the RACADM utility.

What can RACADM do?


RACADM is a command-line tool that allows for remote or local management of Dell EMC servers via the Integrated Dell Remote Access Controller (iDRAC) that is embedded in every PowerEdge server. RACADM provides functionality similar to that of the iDRAC web interface. Another plus for RACADM users is that the Dell Chassis Management Controller (CMC), the embedded management device for blade server chassis, can also be managed remotely. RACADM commands can be run remotely from a management station and/or locally on the managed system. RACADM commands allow operations such as viewing system information, performing power operations, updating firmware, and configuring settings. Since RACADM is run from a command line interface (CLI), system administrators can create scripts that control and update multiple Dell systems at the same time.
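To make that concrete, here is a minimal sketch, in Python, of the kind of multi-server scripting RACADM enables. It assumes the racadm utility is installed on the management station; the iDRAC addresses and credentials below are placeholders, not real values, and the RACADM CLI guide remains the authoritative reference for command syntax.

    # Minimal sketch: run the same RACADM queries against several iDRACs.
    # Assumes 'racadm' is on the PATH; hosts and credentials are placeholders.
    import subprocess

    IDRAC_HOSTS = ["192.0.2.10", "192.0.2.11"]  # placeholder iDRAC addresses
    USER, PASSWORD = "root", "changeme"         # placeholder credentials

    def racadm(host, *args):
        """Invoke a remote RACADM subcommand and return its text output."""
        cmd = ["racadm", "-r", host, "-u", USER, "-p", PASSWORD, *args]
        return subprocess.run(cmd, capture_output=True, text=True).stdout

    for host in IDRAC_HOSTS:
        # getsysinfo reports model, service tag, firmware versions, and more
        print(racadm(host, "getsysinfo"))
        # serveraction handles power operations, e.g. powerstatus, powercycle
        print(racadm(host, "serveraction", "powerstatus"))

The same loop pattern extends naturally to configuration changes and firmware updates, which is what makes a CLI like RACADM such a good fit for administrators managing servers at scale.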

To support your move to RACADM, Dell EMC engineers have written two new documents to guide existing DTK users through a smooth transition. The first is a reference manual for using DTK with supported PowerEdge servers, and it provides transition guidelines for RACADM deployment and configuration in Windows and Linux environments. The second is a reference manual that maps RAIDCFG (a DTK utility used for storage configuration) operations to the equivalent RACADM command syntax for PowerEdge RAID Controllers (PERC). These RAIDCFG operations are supported on PowerEdge server platforms through the current 14th-generation systems.

Thursday 6 June 2019

Power to our Partners


We welcomed roughly 5,000 partners to Las Vegas for our annual Global Partner Summit – our flagship event of the year. I spent the week meeting with partners from around the globe and emphasized three main messages: we are winning, we are investing, and we are innovating. The response was encouraging. To put it simply, there’s really no reason not to be partnering with Dell Technologies.

We are winning.


The momentum we’ve generated over the last year has been stellar. Already the market leader, we took even more share in data protection, even more share in hyperconverged infrastructure and even more share in external storage. Four quarters in a row of storage share gains. Furthermore, we accounted for over half of the storage market growth, or $1.3B, in 2018. We are truly defying gravity with you, our partner community!

We are investing.


At Dell Technologies, we aim to have the #1 program in the eyes of our partners. And this past year, you made more money than ever selling storage with Dell Technologies. This validates the hard work and the investments we’ve made toward designing storage programs to grow our collective businesses.

Moving forward, we will continue to invest in the programs that resonated best with all of you.

We are innovating.


Finally, we made a number of product announcements at Dell Technologies World. Our innovation engine has been working at warp speed to deliver in the areas where you told us we needed to improve.

Let’s start with cloud. Partners and customers alike have been asking us for more integrated solutions spanning the SABs, and we delivered. We announced the Dell Technologies Cloud: a platform where VMware Cloud Foundation (VCF) is integrated onto our VxRail products, deployable on premises. And beginning in the second half of the year, we’ll offer Dell Technologies Data Center as a Service with VMware Cloud on Dell EMC.

On top of all that, to deliver even more choice for our partners and customers, we’ve expanded our partnership with Microsoft and can now deliver a fully native, supported, and certified VMware cloud infrastructure on Microsoft Azure.

And what about storage? As Jeff Boudreau summarizes here, our latest storage innovations will better enable customers to maximize the value of their data capital.

In midrange, the refreshed Dell EMC Unity XT drives 2X more IOPS with an NVMe-ready architecture and delivers greater data efficiency with up to 5:1 data reduction.

In high end, Dell EMC PowerMax now delivers even better performance and, by year end, will be the first true scale-out storage array to ship with Intel Optane DC drives used for persistent storage. PowerMax also has expanded automation and container support.

In unstructured, the latest Isilon OneFS 8.2 provides up to 75% greater capacity and performance to support the most demanding file workloads, enhanced cloud integration with Google Cloud and Alibaba, and increased security.

In data protection and data management, the new Dell EMC PowerProtect Software platform and the multi-dimensional Dell EMC PowerProtect X400 appliance (an industry first!) deliver simplified management, multi-cloud protection, automation services and cyber recovery.

Safe to say, our position in the marketplace is the strongest it’s ever been, and our portfolio further expands what we can offer our customers. There’s no better time to partner with Dell Technologies. And as Michael Dell said in his keynote, “this is just the pregame show!”

Tuesday 4 June 2019

New ‘Experience Zones’ Offer a Fast Route to AI Expertise

New Dell EMC AI Experience Zones showcase the business benefits of artificial intelligence and provide ready access to the latest Dell EMC AI solutions.


Organizations around the world now recognize the opportunity to put artificial intelligence to work to solve pressing business problems. In one sign of this growing AI momentum, a recent IDC report predicts that worldwide spending on AI systems will jump by 44 percent this year, to more than $35 billion.

This push into the brave new world of AI isn’t confined to just certain industries. It’s across the board, according to IDC. As one of the firm’s research managers notes in a news release, “Significant worldwide artificial intelligence systems spend can now be seen within every industry as AI initiatives continue to optimize operations, transform the customer experience, and create new products and services.”

Clearly, when it comes to AI, organizations are ready to seize the day. And here is where things get harder. Now that people have bought into the vision, the challenge is to turn great ideas into great AI systems that deliver measurable business value. To get there, organizations need to gain experience with AI applications and the high-performance computing systems that drive them.

All are invited to the new Dell EMC AI Experience Zones! These hot spots for immersive AI experiences give Dell EMC customers and partners a chance to gain a comprehensive understanding of AI technologies and advancements, as well as practical, hands-on experience with the design and deployment of AI solutions. Along the way, the AI Experience Zones show how organizations can leverage the Dell EMC HPC and AI ecosystem to address today’s business challenges and opportunities across a wide range of industries.

The AI Experience Zones, launched in partnership with Intel®, place a strong emphasis on simplifying AI deployments. Through masterclass training, AI expert engagements and collaboration opportunities that are available on-site, users are guided through the necessary steps to kick-start AI initiatives within their organizations — including design, installation, maintenance and, most importantly, the delivery of tangible business outcomes.

The Customer Solution Center connection


The new AI Experience Zones are an extension of our Customer Solution Centers, which are found around the world. These centers give organizations a chance to gain firsthand experience with the latest and greatest Dell EMC solutions and products, along with offerings from other Dell Technologies companies.

Via a customized Customer Solution Center engagement, your organization can work directly with our subject matter experts in our dedicated labs. Remote connectivity enables you to include global team members in the CSC experience, or to work with us entirely from your own location, as you plan and implement your digital transformation strategy — and work to bring your ideas to life.

Let’s get started


The new Dell EMC AI Experience Zones are up and running in Singapore, Seoul, Sydney and Bangalore, and expanding. Organizations outside of the Asia Pacific and Japan region can get in the game by working with Dell EMC Customer Solution Centers in their geographies to gain hands-on experience with AI technologies and solutions.

To get started with your own AI experience, talk to your Dell EMC representative about accessing the resources of an AI Experience Zone or a Customer Solution Center. And to learn more online, visit our Customer Solution Centers site.

Saturday 1 June 2019

Where Were You When Artificial Intelligence Transformed the Enterprise?

Where were you when artificial intelligence (AI) came online? Remember that science fiction movie where AI takes over in a near-dystopian future? The plot revolves around a crazy scientist who accidentally puts AI online, only to realize the mistake too late. Soon the machines become humanity’s overlords. While these science fiction scenarios are entertaining, they really just stoke fear and add to the confusion around AI. What enterprises should be worried about is understanding how their competition is embracing AI to get a leg up.

Where were you when your competition put Artificial Intelligence online?


Artificial Intelligence in the Enterprise


Implementations of artificial intelligence with Natural Language Processing are changing the way enterprises interact with customers and conduct customer calls. Organizations are also embracing another form of artificial intelligence, called computer vision, which is changing the way doctors read MRIs and how the transportation industry operates. It’s clear that artificial intelligence and deep learning are making an impact in the enterprise. If you are feeling behind, no problem: let’s walk through strategies enterprises are embracing for implementing AI in their organizations.

Key Strategies for Enterprise AI


The first step to embracing AI in your organization is to define an AI strategy. Jack Welch said it best: “In reality, strategy is actually very straightforward. You pick a general direction and implement like hell.” Designing a strategy starts with understanding the business value that AI will bring to the enterprise. For example, a hospital might have an AI initiative to reduce the time necessary to recognize, from CT scans, patients experiencing a stroke. Reducing that time by minutes or hours could help get critical care to patients sooner and ultimately deliver better patient outcomes. By narrowing and defining the strategy, Data Scientists and Data Engineers have a clear goal to focus on achieving.

Once you have a strategy in mind, the most important factor in the success of artificial intelligence projects is the data. Successful AI models cannot be built without it. Data is an organization’s number one competitive advantage. In fact, AI and deep learning love big data. An artificial intelligence model that helps detect Parkinson’s disease must be trained with considerable amounts of data. If data is the most critical factor, then architecting proper data pipelines is paramount. Enterprises must embrace scale-out architectures that break down data silos and provide the flexibility to expand based on the performance needs of the workload. Only with scale-out architectures can Data Engineers help unlock the potential in data.

After ensuring data pipelines are architected with a scale-out solution, it is time to fail quickly. YES! Data Scientists and Data Engineers have permission to fail, but in a smart fashion. Successful Data Science teams embracing AI have learned how to fail quickly. Leveraging GPU processing allows Data Scientists to build AI models faster than at any time in human history. To speed up the development process through failures, solutions should incorporate GPUs or accelerated compute. Not every model will end in success, but each attempt leads Data Scientists closer to the solution. Ever watch a small child first learning how to walk? Learning to walk is a natural process of trial and error. If the child waited until she had all the information and the perfect environment, she might never learn to walk. And a child doesn’t learn to walk on a balance beam; she starts in a controlled environment where she can fail safely. A Data Science team’s start in AI should take the same approach, embracing trial and error while capturing data from failures and successes to iterate quickly into the next cycle.

Dell Technologies AI Ready


The journey may seem overwhelming. However, those forward-thinking enterprises that take on the challenge of AI will gain market share. Dell Technologies is perfectly placed to guide customers through their AI journey, from services that help define an artificial intelligence strategy to industry-leading AI solutions like the Dell EMC Ready Solutions for AI and Reference Architectures for AI. These AI solutions give you informed choice and flexibility in how you deliver NVIDIA GPU-accelerated compute, complemented by Dell EMC Isilon’s high-performance, high-bandwidth scale-out storage, which simplifies data management for training the most complex deep learning models.

Click to watch my coffee conversation with Sophia