Sunday, 18 March 2018

Dell EMC Data Protection for VMware Cloud™ on AWS


Introducing a Single Product that Bundles Data Protection with VMware Workloads on AWS!


As more organizations continue to move applications and data to the cloud, data protection is more important than ever. Solid and reliable data protection workflows guarantee data is always available and ready for recovery when needed, and ensure little to no downtime for our customers’ business operations. For our customers virtualizing with VMware in the cloud, Dell EMC now makes it easy to protect VMware workloads on Amazon Web Services (AWS) with an all-in-one bundle that is cost effective and simple to purchase and manage.

Top Customer Advantages of the Dell EMC Data Protection for VMware Cloud™ on AWS Bundle:


◈ Purchased as a single product that includes all necessary Dell EMC software for backing up and recovering VMware Cloud on AWS workloads

◈ Similar to VMware Cloud on AWS, the bundle is priced on a per-host subscription model

◈ 1- or 3-year subscriptions are available for flexible procurement options

◈ Industry-leading Data Domain deduplication to reduce backup storage capacity needs

◈ vSphere integration and attractive pricing that make it painless to protect VMware Cloud™

Top Partner Advantages of the Dell EMC Data Protection for VMware Cloud™ on AWS Bundle:


◈ Simplifies sales campaign by providing one complete solution to sell

◈ Improves the sales cycle by reducing the number of resources that need to be engaged

◈ Creates upsell opportunities for partners by selling more value with a pre-bundled solution

Top Solution Benefits of Running Dell EMC Data Protection for VMware Cloud™ on AWS:


◈ Proven enterprise data protection for the enterprise public cloud

◈ Seamless integration with on-premises data protection

◈ Industry’s best-in-class deduplication leads to lower consumption costs

◈ Protects VMware workloads on AWS storage for increased resiliency

◈ Natively integrates into VMware management tools for the ultimate automation experience

We are excited to go big and win big with you this year! Good selling!


Friday, 16 March 2018

10 Parallels Between Whiskey Tasting and Data Analytics


In today’s world, the power of data analytics is everywhere. From agriculture to healthcare, from shopping to dating, from the vehicles we drive to the way we do business, our experiences are increasingly shaped by data analytics. This is true even when it comes to whiskey tasting, although in this case the analytics process is driven by our senses and our reasoning rather than sophisticated algorithms.

This is a topic that is close to my heart, given that I’m a director of data analytics who moonlights as a whiskey sommelier. I often have occasion to reflect on the amazing parallels between the principles of data analytics and the process of tasting whiskey.

With that thought in mind, let’s look at 10 of the ways in which data analytics and whiskey tasting share common ground.

Let’s take a step back and see how we got here.

Back in the 1980s, when data warehouse vendors like Teradata provided the ability to pool data, business owners asked even more demanding questions. Then SAS and SPSS, whose origins lie in government and academic work, developed tools that allowed for “what will happen” questions and not just “what happened.” Fast forward to now. Fueled by math smarts and entrepreneurial spirit, we now expect Amazon to recommend books when we shop or Uber to send strangers with an empty seat to our address. Doubt what I’m talking about? Ask Siri or Alexa.

Back to whiskey. By 1954, the U.S. had seen its distilleries consolidate into just four companies. Courage and curiosity brought back independent distillation following the rise of microbreweries in the 1970s. Pioneer Tito Beveridge planted the craft distillery flag in Texas in 1997 after he observed the first seedlings in Kentucky and Tennessee. What followed was a wave of craft distilleries, with shoots emerging in California, Texas, New York, Colorado and Washington. It’s now a multi-billion-dollar business with 27.4 percent growth.

Amplifying this trend, cocktails popularized by shows like Mad Men or House of Cards put whiskey in our collective consciousness. We should no longer expect only wine to come in flights either. We can skip our way through whiskeys too. We arrived at the new normal: whiskey tours, bourbon runs and the rise of The Whiskey Sommelier. Whiskey tastings now pop up like daisies in a sun-drenched field. Whiskey tasting is an active sport involving all five senses and your brain.

Now let’s loop back to data analytics and look at the threads that tie these two worlds together. In particular, let’s look at 10 reasons why data analytics and whiskey tasting share common ground.

1. Deductive Reasoning


When online retailers look for patterns in clickstream data, they are engaging in a practice known as deductive reasoning. I see this and I see that. Therefore this other thing is highly correlated. Ask people at Walmart why they stock strawberry Pop-Tarts in the front of the store before a hurricane. They will tell you it was because they saw a pattern.

The same is true for experiencing whiskey. An active whiskey drinker analyzes what she experiences. Does she note a familiar herb like heather in all of the Speyside single malts? She is looking to establish a premise AFTER collecting data. She is not making a grand statement like “all Speyside single malts have heather” after tasting just one. Inferring generalities from specifics and then running experiments to prove the theory is inductive reasoning.


2. Feature Detection


Data analytics pursues feature detection to find what will predict an outcome. Features are like column headers in a spreadsheet. Lenders inspect aspects of home mortgage applications to see what attribute or combination of attributes will shine a light on which applicants are creditworthy. This process is not unlike whiskey tasting.

Consider tasting wheels. They are just circular spreadsheets. Each spirit can be scored for intensity against each flavor on the wheel. Those features originate from grain selection, fermentation, distillation, barrel type and aging. These are puzzle pieces that whiskey lovers adore assembling in their minds. They are getting to know the whiskey like characters in a novel.
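
If you want to see the parallel in code, here is a minimal Python sketch; the spirits, flavor columns and intensity scores are all invented for illustration, but the shape of the table is exactly that of a tasting wheel.

```python
import pandas as pd

# Each row is a spirit, each column a flavor "feature" scored 0-5 --
# a tasting wheel flattened into a table.
tastings = pd.DataFrame(
    {
        "smoke":   [4, 0, 1],
        "heather": [2, 3, 0],
        "vanilla": [1, 2, 4],
        "citrus":  [0, 1, 3],
    },
    index=["Islay single malt", "Speyside single malt", "Kentucky bourbon"],
)

# Feature detection starts by asking which columns vary enough to
# separate one spirit from another.
print(tastings.var().sort_values(ascending=False))
```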

3. Classification


One of the shortcuts to dealing with large populations is to bucketize them into groups. The Boomer, Gen-Xer and Millennial labels are nothing more than a classification exercise based on birth years. We make generalizations about each group’s interests. Consider that Red Bull has Millennials in its crosshairs while Metamucil aims at the gray hairs. This classification technique works well with whiskey comparisons too. Bourbon by law needs to have at least 51 percent corn in its mashbill. So in a blind test comparing spirits from different grains, *look* for the candy corn aroma. It’s a signature of this class of whiskey.
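
As a hedged illustration, here is how that bucketing looks in Python with pandas; the birth years and the generational cut points are approximate and purely illustrative.

```python
import pandas as pd

birth_years = pd.Series([1950, 1963, 1972, 1985, 1991])

# Bucketize a continuous value (birth year) into discrete classes.
# The generational boundaries below are commonly cited approximations.
generations = pd.cut(
    birth_years,
    bins=[1945, 1964, 1980, 1996],
    labels=["Boomer", "Gen-Xer", "Millennial"],
)

print(pd.DataFrame({"birth_year": birth_years, "generation": generations}))
```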

4. Propensity


Propensity is a fancy term for what is likely to happen. It’s how data analytics deals with the unknown. We see a drop in the price of oil and the propensity for Houstonians to leave the family cell phone plan rises. The same principle underpins whiskey when it comes to food pairing. Chicken piccata, an Italian dish served with a lemon and caper sauce, is likely to go well with a Rye. Why? Because the rye grain has lemon on the nose. So the propensity for the match is high.

5. Iteration


Data analytics professionals, just like master distillers, like to experiment before they lock in a data model. They have ideas and tweak as they go. Internet properties like Facebook play with the shade of blue to see which gets the most clicks. So why not expect that with whiskey? Whiskey blenders like Compass Box continue to push the envelope for their blended malts. They were famously cornered by the Scotch Whisky Association for a rather unconventional aging process in the original recipe of Spice Tree.

6. Establishing a Baseline (Supervised Learning)


We know what is normal for blood pressure because doctors have measured this vital sign for years. More than that, they have correlated both positive and negative outcomes to the data. They know a patient is high risk because of the histories of hundreds of thousands of patients. The role of data analytics is to determine what is “normal” based on a given data set. However, normal becomes useful when we know the outcome of a certain event too. In the land of data analytics, when we establish a baseline with known outcomes and ask algorithms to pick out things that predict the future, we are engaged in supervised learning.

The act of building a baseline for whiskey tasting comes from personal experience. Blind tasting after blind tasting helps the taster single out the single malts from the blends. A corn mashbill from rye or barley. Secondary casking versus single. The more a whiskey taster experiences, the bigger the sample set, the broader the foundation, the more the taster knows. This foundational knowledge helps whiskey tourists know when they have left the paved road and are launched on an adventure.
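
For the analytics side, here is a minimal supervised-learning sketch using scikit-learn; the handful of labeled tastings is invented and stands in for the much larger baseline a real taster (or model) would build.

```python
from sklearn.tree import DecisionTreeClassifier

# Flavor features per dram: [smoke, fruit, grain sweetness], scored 0-5.
# Labels are the known outcomes from past blind tastings -- the baseline.
X = [
    [5, 1, 1],  # labeled single malt
    [4, 2, 1],  # labeled single malt
    [1, 3, 4],  # labeled blend
    [0, 4, 4],  # labeled blend
]
y = ["single malt", "single malt", "blend", "blend"]

model = DecisionTreeClassifier(random_state=0).fit(X, y)

# A new, unlabeled dram: the labeled baseline is what makes a call possible.
print(model.predict([[4, 1, 2]]))
```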

7. Anomaly Detection


Anomalies get a bad rap. That is until you understand that all parents want their kids to be normal, but never average. Being above average earns gold medals on the downhill and early acceptance to that hard-to-get-into college. This is not normal. Seeking anomalies is the job of talent scouts. It is also core to data analytics because it’s something from which we learn. It might be the use of a product the designer never expected. Ask Pfizer about its original intent for Viagra.

The world of whiskey tasting presents a similar opportunity. Green Spot Irish Whiskey has green apple all over it. Westland Single Malt tastes like chocolate. And Hudson Four Grain has a barnyard quality to it. You can almost hear the sheep. When you hang notes in the air like Pavarotti, you get noticed. Anomaly detection is a different kind of appreciation. Whiskey aficionados aim for this.

8. Normalization


Eighty percent of a data scientist’s time is spent wrangling data — filling in the missing elements in a table so the columns and rows are ready to be analyzed. It is hard to draw conclusions when the artifacts are missing. And this same rigor is pursued by the whiskey trade. Whiskeys should be evaluated only if the spirits are served in the same way and at the same time.

We know that wine oxidizes in the glass. An angry glass of cabernet becomes approachable after an hour once it breathes. Time also plays into whiskey, except oxygen is not the factor. When alcohol evaporates from the glass, precious olfactory volatiles escape with it. A whiskey freshly poured might be feisty at five minutes and friendly at 50. So it’s important that we treat data and whiskey with the same level of consistency: same glass shape, same pour size, same time out of the bottle.
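
Here is a small Python sketch of that wrangling step; the table and its missing values are invented, and filling gaps with each column's median is just one reasonable policy among several.

```python
import pandas as pd

pours = pd.DataFrame(
    {
        "pour_ml":          [30.0, None, 30.0, 45.0],
        "minutes_in_glass": [5.0, 50.0, None, 5.0],
        "rating":           [88.0, 91.0, 86.0, None],
    }
)

# Fill the gaps so every row is comparable (here, with each column's median),
# then standardize so columns on different scales can sit side by side.
filled = pours.fillna(pours.median(numeric_only=True))
standardized = (filled - filled.mean()) / filled.std()
print(standardized.round(2))
```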

9. Enrichment


Business data only gets better when we add diverse data types like geo-spatial, event-related or weather data to it. When a Texan shops online at happy hour on Cinco de Mayo and the cart is abandoned, there should be no surprise. Oh, it was sunny that day. This context-driven awareness adds a deeper understanding. Modern analytics is all about enriching structured data with unstructured data to gain a better experience.

Likewise, distillers age whiskey in second or third casks. They take a completed product and finish it off in wine, sherry, port, rum or Sauternes casks. Or it might take the form of the *blenders’ art*, like Johnnie Walker, Dimple or Compass Box. These distillers ladder up the experience by marrying whiskeys from different places. Hudson Four Grain ages the same spirit in three different sizes of barrels, each with a different char, to get that exact expression.
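
On the data side, a minimal Python sketch of enrichment might look like this; both tables are invented, and the join simply attaches calendar and weather context to each shopping event.

```python
import pandas as pd

carts = pd.DataFrame(
    {
        "date":      ["2018-05-05", "2018-05-06"],
        "state":     ["TX", "TX"],
        "abandoned": [True, False],
    }
)

context = pd.DataFrame(
    {
        "date":    ["2018-05-05", "2018-05-06"],
        "holiday": ["Cinco de Mayo", None],
        "weather": ["sunny", "rain"],
    }
)

# Enrichment: the cart data alone says "abandoned"; joined with calendar and
# weather data it says "abandoned on a sunny holiday afternoon".
enriched = carts.merge(context, on="date", how="left")
print(enriched)
```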

10. Shaped by Taxation


Fans of history have no trouble remembering when Alexander Hamilton rode to Western Pennsylvania in 1794 to help his boss put down the Whiskey Rebellion. This was before Mr. Hamilton was a Broadway sensation. He was our first secretary of the treasury and was hungry to pay down the national debt with a whiskey tax <gasp>. Our friends in Scotland struggled with the same issue in the mid-18th century, when distillers were taxed based on still size and not production. In 1787 taxation was the tipping point in the split between the Lowlands and the Highlands. Religion, language and affiliation with England may have proper historians talking, but a true Gaelic Highlander knew the real argument was over whether blended whiskey was a *real* whiskey or swill to appease the English.

Likewise, analytics in the English-speaking world found its voice because of taxation. First appearing in Britain in Roman times, the census became a consistent effort in 1801. Its mission was to allocate precious resources. And since counting people one by one takes longer than a single decade, statistics found its place in economics.

So the next time you raise a Glencairn glass of the *water of life*, just remember that as you ponder notes of heather, sea air and the smell of warm biscuits, you might actually be thinking like a data scientist.

Wednesday, 14 March 2018

After Meltdown – Best Practices for Updating Your PowerEdge Server’s BIOS


The recent news of side-channel analysis vulnerabilities affecting many modern microprocessors has, as you can imagine, generated more than a few inquiries from our customers about updating their PowerEdge servers. If you’re in the same boat, asking yourself “What comes next? How do I apply these BIOS updates?”, then this post should help.

First things first, applying a BIOS update to a PowerEdge server is easy. Dell supplies different tools so you can choose the method best suited to your particular IT environment and needs.

Updating One or Two Servers?


If you’re just updating one or two servers in a small shop, a BIOS update package can be obtained manually from support.dell.com by keying in your server’s Service Tag and then looking for a BIOS update such as the one shown in figure 1.


Fig 1 – support.dell.com showing a BIOS update for PowerEdge server

NOTE: Dell EMC downloads and driver updates are free. That’s always been the case and there are no plans to change that.

Downloading this file and then applying it manually to a local server is straightforward, but if you have hundreds or more servers in a remote data center you’ll want to keep reading because we have better options for you.

Updating Lots of Servers, Even Automatically


Intelligent Automation is a Dell EMC hallmark, and Dell EMC offers a range of OpenManage solutions that can simplify mass server updates. With Dell EMC Repository Manager, new updates from Dell EMC online catalogs can be automatically downloaded, as shown in figure 2.


Fig 2 – Dell EMC Repository Manager interface

You can tell Repository Manager when to download updates, which servers you own, and what kind of updates you want. You can also command Repository Manager to download different sets of updates for different logical or physical groups of servers, and then to separate them into repositories in different locations. This gives you the flexibility to support different deployment methods.

So now you have a BIOS update. You’ve tested it and you want to deploy it to the production servers in your datacenter. Now what? Dell EMC recommends one of the following approaches to automate updates:

◈ Use OpenManage Essentials or OpenManage Enterprise
◈ Use an OpenManage integration for either Microsoft System Center or VMware vCenter
◈ Create a custom automation script that operates with standard management APIs provided by the iDRAC with Lifecycle Controller embedded in every PowerEdge server.

As an example, OpenManage Enterprise, the next-generation Dell EMC management console, provides a simple click-and-go process to schedule and perform BIOS updates for thousands of servers (see figure 3).


Fig 3 – OpenManage Enterprise screen showing target servers to update

Those systems will process the update as scheduled and with no further intervention. If you’re new to managing PowerEdge servers, this is an easy way to efficiently update thousands of servers without a lot of effort.

If you already manage your IT environment with an existing management platform such as System Center or vSphere, our integrations and connections make short work of incorporating PowerEdge servers.

And if you use scripts to perform IT operations, we offer resources on Dell TechCenter as well as open source PowerShell and Python scripting repositories at http://github.com/dell. These assets provide a good starting point for automation, and can be adapted to the specifics of your IT environment.
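
As one illustration of the script-based route, the short Python sketch below (not a Dell EMC sample) reads the installed BIOS version from a list of iDRACs over the standard Redfish REST API before you schedule an update. The addresses and credentials are placeholders, the inventory entry names can vary by platform and firmware level, and disabling certificate verification is only acceptable in a lab.

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder iDRAC addresses and credentials -- replace with your own.
IDRACS = ["192.168.0.120", "192.168.0.121"]
AUTH = HTTPBasicAuth("root", "calvin")

for ip in IDRACS:
    # Firmware inventory is part of the standard Redfish UpdateService.
    url = f"https://{ip}/redfish/v1/UpdateService/FirmwareInventory"
    inventory = requests.get(url, auth=AUTH, verify=False).json()

    for member in inventory.get("Members", []):
        item = requests.get(
            f"https://{ip}{member['@odata.id']}", auth=AUTH, verify=False
        ).json()
        # Inventory entry names vary by platform; match anything BIOS-like.
        if "BIOS" in item.get("Name", "").upper():
            print(f"{ip}: {item['Name']} version {item.get('Version')}")
```

Once you know which systems need the new BIOS, the update itself is best scheduled through OpenManage Essentials/Enterprise, the System Center or vCenter integrations above, or your own automation against the iDRAC.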

Dell EMC Advantage: Dell EMC provides the tools to deploy updates in a manner that best suits your needs. We realize that one method does not fit all situations.

Sunday, 11 March 2018

Mobile World Congress and the Critical Role of Specialist Telecoms Companies

Not quite the headline you might expect from a company that is big on promoting open standards, but let me explain. It’s true that we continue to see a massive shift in the industry away from proprietary, expensive IT equipment to standardised, cost-efficient computing blocks.


Within the industry, Dell EMC OEM is now regarded as an essential infrastructure partner, providing the IT foundational platform upon which the telco solution is built. However, that doesn’t mean that specialist companies have gone away and are no longer required. On the contrary, their skills continue to be highly relevant and in demand.

Network Virtualisation


For example, take network virtualisation. A hot topic for some time, it has featured prominently in lab work and proof-of-concept designs, but we are now seeing service providers deploying network virtualisation infrastructure in the field. As you know, deploying a network involves everything from antennas and base stations to edge computing, IoT, core switching, transmission, operations support, business support, analytics, performance management, customer experience and more. While there are lots of component parts, each of these elements needs to work in tandem, and the network must always be available. Given this complexity, it’s obvious that rolling out a network is a specialist activity.

The important role of specialist companies


And so, while network infrastructure costs are reducing thanks to the use of standardised IT components, I firmly believe that installation, support and SLA will continue to be the domain of specialist companies. After all, it’s not just a question of installing a server and software and off you go – each installation must be supported with an SLA functional guarantee. Specialist companies such as Ericsson and Nokia have huge expertise in installing and supporting networks. These specialist skills will continue to be in demand as virtual networks continue to be built out, using standard compute infrastructure.

Horses for courses


In fact, Dell EMC OEM is already deeply involved in supporting Ericsson and Nokia in the deployment of virtualised networks, based on standard infrastructure components. I see these relationships as key to the successful roll out of modern telecommunications networks. No one vendor can deliver it all – we need horses for courses, and each party brings value-add to the table. The important word is partnership.

On that note, I’m looking forward this week to meeting representatives from the entire telecoms ecosystem, including service providers, telecom equipment manufacturers and network equipment providers. I’d love to hear your comments and predictions about the future of the industry. Do visit our booth in Hall 3, Stand 3K10 where we are showcasing the following solutions:

Edge Solutions


◈ View the newly designed micro Modular Data Center (MDC) – debuting at Mobile World Congress – and learn how you can embed compute and storage capacity at the edge where data is being generated.
◈ Re-imagine the customer edge with new universal CPE platforms and SD-WAN Ready Node solutions.

Core/Cloud Solutions


◈ See how you can bring the cloud to the network with our NFV solutions and Telco Cloud offerings.
◈ Experience Dell EMC’s larger MDC capabilities with a virtual and interactive tour. Put on a headset and be transported to one of our latest MDC designs, the Flex Module.
◈ View our open and flexible rack scale infrastructure, the DSS 9000, and see how Dell EMC is enabling NEBS-compliant rack scale solutions.

IoT Solutions


◈ See how Dell EMC Isilon scale-out NAS and Elastic Cloud Storage (ECS) solutions provide highly efficient edge-to-core-to-cloud storage with built-in analytics to unlock the value of your IoT data.
◈ Discover how Dell IoT Gateways transform Fleet Management by eliminating machine to machine telematics silos, for more cost savings; increased customer satisfaction and safety; and improved employee performance.
◈ Learn how Dell IoT is revolutionising the building services sector and facilities management by transforming high energy costs into savings with a powerful, integrated intelligent building solution.

Thursday, 8 March 2018

For Communications Service Providers, Digital Transformation Requires Right Partner

It’s accepted knowledge across industries that companies that don’t undergo a digital transformation will find it difficult to survive in the coming decade. Legacy technology simply can’t support the performance and virtualization that businesses need to operate efficiently and provide modern products and services to their customers.


But demand for modern infrastructure really begins upstream, with the Communications Service Providers (CoSPs) that own the networks powering business connectivity. The problem is that many large CoSPs are still operating on a wide range of proprietary, legacy technologies themselves. These technologies require a large number of people to maintain and operate them. In addition, these technologies deliver network speeds and responsiveness that are less-than-optimal for the businesses downstream.

From this starting point, CoSPs face the seemingly insurmountable task of becoming virtualization experts, sorting through hundreds of vendors and products to architect the ideal infrastructure, and implementing the new technology in an optimal way – all without disrupting existing services.

More realistically, CoSPs need a reliable, knowledgeable partner to help them set a digital transformation strategy, prioritize and select technologies, and undergo digital transformation in a way that sets them up for success.


5 Key Focus Areas


CoSPs’ most pressing need (and opportunity) is to infuse infrastructure with more cloud technology to make it faster, more responsive and more automated. To do so, they need to adopt a significant amount of compute and virtualization technology across nearly every aspect of their infrastructure, starting with the following five areas:

◈ CoSP cloud – Central Offices need to evolve beyond physical appliances to provide cloud-based services to customers. This means upgrading to virtual appliances, then implementing virtual network functions, including software-defined networking (SDN). This will serve as a mechanism to stitch services together as well as help scale the networking topology between virtual functions.

◈ Next-gen access – Today’s companies need higher bandwidth to support their day-to-day operations and provide products and services in a fast and reliable way to their own customers. Providing next-gen access typically means migrating from static and expensive multiprotocol label switching (MPLS) virtual private network (VPN) circuits and physical customer premise equipment (CPE) nodes to more virtualized CPE nodes and secure access technologies, along with software-defined wide area networking (SD-WAN).

◈ Operations and business support systems (OSS/BSS) transformation – CoSPs need to make it faster and easier to launch new services to customers by incorporating automation and telemetry and ensuring the systems they use to deliver network-based services have application programming interface (API)-driven capabilities.

◈ Edge computing – To deliver services more rapidly across widespread markets, CoSPs will need to adopt enterprise edge computing in the next 12-18 months. There are a number of approaches for doing this, from evolving the Central Office with architectures such as Central Office Re-architected as a Datacenter (CORD), to building an edge services cloud incorporating capabilities such as multi-access edge computing (MEC), to evolving the edge outside of existing physical facilities with modular data centers.

◈ 5G infrastructure – When 5G becomes available in the next 18-36 months, CoSPs will be tasked with a new set of challenges. The requirements of 5G are roughly between 100-1000 times the performance and scale of 4G, at 1/1000th the latency, with significantly different economics on the monetization and operations. SDN will no longer be contained within the Central Offices, and CoSPs will need to embrace end-to-end SDN principles, such as network slicing. Network functions virtualization (NFV) will no longer be a centralized function running inside a virtual machine (VM), but inside containers or even running on top of bare metal.

The Partner CoSPs Need


Dell EMC makes digital transformation much easier for CoSPs. Not only are we a worldwide leader in compute and cloud-enabled IT infrastructure, we have the partnership framework in place to strategically and holistically guide CoSPs through the process of modernization across all five key areas.

Our experts give CoSPs the technology and tools to assemble the right combination of infrastructure and service capabilities to serve their business customers and remain competitive for years to come. Dell EMC’s focus on open-standards-based, disaggregated architecture means CoSPs won’t relive the mistakes of the past, trading proprietary solutions and vendor lock-in for a flexible, future-ready, scalable architecture.

The harsh reality is that most CoSPs won’t achieve the levels of virtualization and optimization they need without the right partner on their side. Dell EMC is poised to play a pro-active role in reshaping the future for service providers as they achieve digital transformation and provide the modern technology that will power the coming evolution of business.

Friday, 2 March 2018

Home Thoughts from Abroad at Mobile World Congress

Transformational Change and the Telecom Industry


Transformational changes usually happen quite abruptly and are typically caused by shifts in usage patterns or the disruptive entry of a new business case, when the priorities of yesterday may be rendered irrelevant. You only have to remember WhatsApp and how, almost overnight, it destroyed the SMS text business model. Of course, the industry has been evolving for years. We’ve moved from the remote sending of messages or voice communication by phone to today’s focus on connecting technology to people via devices, or the Cloud.

Network Virtualisation


Let me use an example that may feel more familiar. Telecom infrastructure (in terms of compute, storage and networking) used to be regarded as a purely physical thing. Something to be consumed by different types of applications. The industry traditionally built appliances with infrastructure, middleware and workloads. However, with the advent of NFV, workloads have now become virtualised, delivering greater flexibility, quicker time to market and smarter use of resources.

Workload management with the Cloud and the Edge


While some companies were in a technology race to be first out of the gate with a virtualisation stack and other technologies, I am glad that the focus throughout has remained firmly on resources in the infrastructure, and more importantly, the box. With the introduction of Cloud on one side and Edge on the other, we are now seeing a new transformation. Workload management, in its various guises, is rightly becoming the focus for Telecom and NFV rather than worrying about what the workloads run on, or what stack is being used.

Software-defined infrastructure


As a result, we are seeing the emergence of Software-defined Infrastructure (SDI) –  the concept of allocating bare metal resources in geographically distributed sites and grouping them together to manage in a virtual datacentre. The advantage of SDI is that it can place workloads in either private or public Clouds to maintain data integrity while increasing speed and efficiency.

I think that this transformation is being driven by the fact that NFV is not moving towards the homogeneous execution environment that was expected some years ago. Instead, it is moving in the opposite direction, with more variants of virtualisation, like containers, as well as the need for bare-metal execution of workloads. Added to this, we are also seeing an increased need to place workloads closer to the end-user for latency purposes and to deliver a better user experience, as well as the movement of workloads towards the Cloud for scale and economy. This is all without changing the environment or redeploying the products. I think that this development is pretty remarkable.

A software-defined future


In fact, I believe that we might well be seeing the real emergence of a software-defined future, where flexibility is fulfilled by automation, orchestration, policy, analytics and reporting.  After all, a large share of the potential value coming from digitisation across global industries over the next decade is dependent on the telecom industry delivering productivity improvements. According to the 2017 World Economic Forum, the digital transformation of telecommunications represents a $2 trillion opportunity for industry and society.

Interesting times ahead! I’d love to hear your comments, predictions and questions. Click here to read what my colleague, James Hole from Dell EMC OEM has to say on the role of specialist telecom companies.  Click here to read the views of our marketing lead for OEM Telecom solutions. Finally, if you’re at Mobile World Congress, we’d really love to meet you! Do visit our booth in Hall 3, Stand 3K10 where we are showcasing the following solutions:

Edge Solutions


◈ View the newly designed micro Modular Data Center (MDC) – debuting at Mobile World Congress – and learn how you can embed compute and storage capacity at the edge where data is being generated.

◈ Re-imagine the customer edge with new universal CPE platforms and SD-WAN Ready Node solutions.

Core/Cloud Solutions


◈ See how you can bring the cloud to the network with our NFV solutions and Telco Cloud offerings.

◈ Experience Dell EMC’s larger MDC capabilities with a virtual and interactive tour. Put on a headset and be transported to one of our latest MDC designs, the Flex Module.

◈ View our open and flexible rack scale infrastructure, the DSS 9000, and see how Dell EMC is enabling NEBS-compliant rack scale solutions.

IoT Solutions


◈ See how Dell EMC Isilon scale-out NAS and Elastic Cloud Storage (ECS) solutions provide highly efficient edge-to-core-to-cloud storage with built-in analytics to unlock the value of your IoT data.

◈ Discover how Dell IoT Gateways transform Fleet Management by eliminating machine to machine telematics silos, for more cost savings; increased customer satisfaction and safety; and improved employee performance.

◈ Learn how Dell IoT is revolutionising the building services sector and facilities management by transforming high energy costs into savings with a powerful, integrated intelligent building solution.