Friday 31 August 2018

EPEAT: Dell EMC PowerEdge Servers Further Validated as the Responsible Choice

Those who know the Electronic Product Environmental Assessment Tool (EPEAT) registry know it is one of the best indicators of sustainably produced electronics. The Green Electronics Council is expanding its registry to include a new server category, and customers can find multiple Dell EMC PowerEdge servers listed from the start.

Why EPEAT matters


Since it first launched more than a decade ago, EPEAT has grown into a trusted procurement tool for making responsible purchasing decisions. In addition to the newly launched server category (based on the NSF/ANSI 426 standard), EPEAT has separate registries for computers and displays (monitors), imaging equipment, phones and TVs. To qualify for EPEAT, a server must meet a core set of requirements in eight sustainability categories:

1. Energy efficiency
2. Management of substances
3. Preferable material use
4. Product packaging
5. Design for repair, reuse and recycling
6. Product longevity
7. Responsible end-of-life management
8. Corporate responsibility

For customers looking to ensure they have sustainability covered, EPEAT’s comprehensive approach is an excellent choice. What’s more, organizations find it easy to trust the EPEAT registry because the Green Electronics Council works with third-party assurance bodies to verify product claims. While EPEAT is a voluntary program, it is extremely important to the U.S. Federal Government and other national governments. We also often see requirements for EPEAT registration from commercial customers in RFPs.

A closer look at the energy efficiency criteria


Some of the criteria EPEAT uses for qualifying a product are likely familiar – like the U.S. EPA’s ENERGY STAR program. Dell has a strong focus on power and cooling with all our data center equipment. As part of Dell’s Legacy of Good goals, we committed to reducing the energy intensity of our whole product portfolio (2012-2020) by 80 percent. I’m very proud to say that servers lead the way, already achieving a 74.8 percent reduction in energy intensity.

This is complemented by the Fresh Air 2.0-capable hardware that comes standard with many 14th generation Dell EMC PowerEdge servers, allowing the equipment to run at higher temperature and humidity levels (up to 45°C/113°F at a 29°C maximum dew point); be cooled by clean outside air meeting ISA-71 G1 quality; and be used in both air-side and water-side economized environments. And, as we know, these energy efficiency measures don’t just serve an environmental purpose – they help combat rising energy costs and can also help you reduce the total cost of ownership for delivering IT. In some cases, you can even avoid the need to build out new facilities.

Dell’s role in developing the new standard


As Lucian Turk, Sr. Principal Engineer for Environmental Affairs and our EPEAT Advisory Council member, explained to me, “A lot of work by a lot of dedicated people went into creating this new standard. It sets the bar for sustainable procurement tools very high and will give purchasers a clear window into the sustainability of the servers they are interested in.” He should know – Lucian played a major role on Dell’s behalf in helping develop the new standard.

Dell has been involved all along the way with the Green Electronics Council, helping develop their other standards as well. We believe globally harmonized standards on sustainability help move our whole industry forward. We have encouraged the adoption of criteria like the use of closed-loop plastics, recycling criteria, and social dimensions of the supply chain and production process.

The Green Electronics Council requires a Voluntary Consensus Process for developing standards, emphasizing openness, balance, and due process, among other principles. Their commitment to openness means a truly multi-stakeholder approach, involving sustainability advocates, manufacturers like us, purchasing professionals, government agencies, academics, recycling providers and policy representatives. This is another reason EPEAT is such a comprehensive tool.

Putting EPEAT to work for you


As the world begins shifting to a more circular economy, manufacturers need to innovate. Dell’s strategy is to look across the whole lifecycle of our products – from design to end-of-life and everything in between – in order to ensure we meet customer expectations and sustainability goals. It’s because of this focus that you can find so many products listed in the EPEAT registries.

You can search the new EPEAT registry for servers by selecting the server category, or by filtering on geography or manufacturer.

Computer Vision and Machine Intelligence: Advancing the Human/Machine Partnership

As we head into VMworld, hot on the heels of a successful Dell Technologies World, it seems like an appropriate time to provide an IoT update.


As you know, last October, we unveiled our vision and strategy, a new Dell Technologies IoT Solutions Division as well as IoT-specific incubation projects including Project World Wide Herd for federated analytics. This project has now entered a formal technology preview for VMworld as it approaches production launch under a new name to be unveiled soon.

A lot has happened since that announcement so settle yourself comfortably for a long but hopefully interesting read!

Engineered solutions, bundles and evolving IoT offers


I’m happy to report that our new IoT Solutions Division is fully operational. The division’s core charter is to engineer IoT and edge computing solutions that combine the power of the Dell Technologies hardware and software infrastructure portfolio with partner assets for specific use cases. Our ultimate goal is to make it easier for our customers to realize business value at scale.

In parallel, we’re also working to enable the channel by collaborating with partners to develop easy-to-consume and deploy solution bundles. Examples launching soon include solutions for cold chain retail, Data Center Infrastructure Management (DCIM) and remote monitoring of Oil and Gas field assets.

In terms of existing purpose-built IoT offerings from throughout the portfolio, we continue to find new applications for our Dell Edge Gateways, with customers valuing their rugged utility combined with the global scale and support of a Tier 1 manufacturer. A few months back, VMware launched version 1.1 of Pulse IoT Center, which addresses the often overlooked function of being able to remotely manage heterogeneous things and gateways at scale.

Surveillance opening a view into much broader potential


All good stuff, and to coincide with the delivery of our first engineered solution for video surveillance, I want to put this announcement within the context of the broader Dell Technologies IoT/Edge vision and associated roadmap.

Surveillance is a use case within our broader computer vision strategy – the first milestone for our vision-vision, if you will. And so, this first engineered solution – while important for video surveillance – is also significant in the greater scheme of things.

The bottom line is that we see computer vision as a foundational enabler for many IoT use cases – after all, cameras are one of the best sensors around. Further, applying analytics to these data feeds enables customers to more cost-effectively monitor events in the physical world and automate decision-making.

Our aim is to enable computer vision in a variety of use cases in addition to classic surveillance so that customers can “see more than meets the human eye” (enter Transformers theme music) and receive alerts and summaries based on important context.

But wait, there’s more!

One foundation, different workload themes


As part of our roadmap of engineered solutions, we’re investing in a single Dell Technologies infrastructure foundation comprising hardware and software, spanning the edge to the cloud and addressing the needs of OT and IT on Day 0, 1 and 2 (before, during and after deployment). This work will be done against two key workload themes – computer vision and machine intelligence.

Each component in this loosely-coupled infrastructure offer is engineered to work together to optimally support these two themes, depending on which elements are included and dialed in alongside value-add from our partners. Even though it’s ultimately one foundation, we distinguish between the two themes as each track has slightly unique functional properties that require different combinations of tools from the overall Dell Technologies portfolio and partner ecosystem.

Computer vision workloads


Computer vision workloads are enabled by cameras and imaging sensors (including thermal and infrared). They generally require different types of analytics tools than structured machine data, have inherently heavy “northbound” content flow, and as a result drive high compute and storage needs by default.

No surprises here – I challenge you to find someone who thinks it’s a great idea to blindly stream 4K video over the internet compared to snapshots of critical events – at least, someone who doesn’t represent a telco, ISP, or public cloud provider who wants to move and/or store your data for a price, only for you to pay again to get it back!
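To put rough numbers behind that, here’s a back-of-the-envelope sketch comparing the two approaches. The bitrate, snapshot size and event rate are all assumed values for illustration, not figures from any real deployment:

```python
# Back-of-the-envelope comparison of continuously streaming 4K video
# versus uploading only snapshots of critical events.
# All rates below are illustrative assumptions, not measured figures.

FOUR_K_BITRATE_MBPS = 25          # assumed compressed 4K stream, ~25 Mbit/s
SNAPSHOT_SIZE_MB = 2              # assumed JPEG snapshot size
EVENTS_PER_HOUR = 12              # assumed critical events per camera per hour

def gb_per_day_streaming(bitrate_mbps: float) -> float:
    """Data volume if the full stream is sent northbound all day."""
    return bitrate_mbps / 8 * 3600 * 24 / 1000  # Mbit/s -> GB/day

def gb_per_day_snapshots(size_mb: float, events_per_hour: float) -> float:
    """Data volume if only event snapshots are sent northbound."""
    return size_mb * events_per_hour * 24 / 1000

streaming = gb_per_day_streaming(FOUR_K_BITRATE_MBPS)
snapshots = gb_per_day_snapshots(SNAPSHOT_SIZE_MB, EVENTS_PER_HOUR)
print(f"Streaming: {streaming:.0f} GB/day, snapshots: {snapshots:.2f} GB/day")
```

Even with generous snapshot assumptions, the continuous stream is hundreds of times heavier per camera, which is exactly why northbound content flow dominates compute and storage planning for vision workloads.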

Traditionally with vision-based solutions, “southbound” data flow has generally been limited to driving displays and speakers plus PTZ (Pan-Tilt-Zoom) control for motorized cameras, but more below on how this is changing with advanced robotics and autonomous vehicles.

Machine intelligence workloads


Machine intelligence, on the other hand, involves structured telemetry data from control systems and sensors, including those that can provide data that cameras cannot.

Of course, image-based sensors can provide extremely rich information about the physical world – not just video in the purest sense but also serving to detect attributes like acceleration and motion, the position of a robotic arm, wobble of a motor shaft, temperature and visible gas emissions. However, last time I checked, cameras can’t measure parameters like voltage, current, pressure or oil particulates inside a sealed engine block.

Simple telemetry-based sensors are also important when low power operation (and especially long battery life) is required. Finally, in some cases, it’s not feasible or desirable to install a camera to detect certain conditions due to privacy concerns. For example, how would you feel if a camera was used as a motion sensor in your bathroom to turn on the lights?

Another more universal consideration for machine intelligence is the notion of control, which drives unique requirements to address latency and predictability – ranging from “hard” real time (e.g. super low-latency and deterministic, as in deployment of your car air bag) to “soft” real time (e.g. reactions in seconds to minutes, if not more, and tolerant of slight variances in response time). Granted, vision-based systems that perform in hard real-time are increasingly a necessity in robotics and autonomous vehicles.

Pi and the Sky


I like to talk about how we’re in the “AOL stage of IoT”, which puts developers in what I call “Pi and the Sky” mode when it comes to machine-based data – simply connecting their sensors or PLCs to some sort of Raspberry Pi-class device, and in turn to the public cloud, simply to get started because it’s cheap and easy.

While it’s widely accepted that a shift to edge computing is a necessity to support the sheer amount of data coming online, developers just getting started with capturing basic telemetry data from sensors don’t realize they have an edge data problem… yet. Accordingly, these deployments are often first addressed at the thin gateway edge and the cloud, unlike computer vision where data saturation happens pretty much immediately.

In short, the time scales for the adoption of edge computing at scale are different between computer vision and machine intelligence even though the same foundational computing infrastructure elements can apply for either theme.

EdgeX Foundry – enabling an open ecosystem for machine intelligence


Looking outside of the company, we see our investments in the EdgeX Foundry project as key to enabling the machine intelligence theme in our solution roadmap in a very open and scalable way.

There’s a lot of detail about the project online so I’ll keep it short here, but in case you haven’t heard about EdgeX, it’s an industry-wide vendor-neutral open source project, hosted by the Linux Foundation and focused on facilitating open interoperability between commercial value-add solutions at the IoT edge. The idea is to maximize customer choice, making it easier for them to achieve faster ROI with their IoT deployments.

Even though EdgeX is a hardware and OS-agnostic framework versus an OS itself, one way I describe it is that it’s slated to do for IoT what Android did for mobile – create an open marketplace of interoperable devices and applications.

I’m proud to have been part of the team at Dell that first seeded the project with code in April 2017 that was developed over the course of two years with lots of feedback from partners and customers. The project has since taken on a life of its own in a growing community effort.

Freedom of choice, but we’ll provide you with some great, interoperable choices


Continuing the topic of customer choice, each business unit across Dell Technologies has its own offers that are either IoT-specific or highly relevant to IoT and, more generally, edge computing. Our strategy is to render each of these offers as independently valuable but also better together when used in integrated solutions combined with a choice of third-party value-add. Bottom line, we’re all about providing customers with choice and flexibility, not just now but into the future.

In order to enable this, we’re not only contributing to the EdgeX Foundry project as a community member but also leveraging the EdgeX framework internally to help federate our own solutions portfolio. Picture a set of building blocks with super-flexible, open glue, where customers can use all the enabling components together or separately in a mix and match approach.

Getting past the “AOL stage” to IoT scale and advanced class


The loosely-coupled nature of the EdgeX framework enables customers to have consistent tools for data ingestion, security and management, regardless of what devices they use combined with their choice of on-prem and cloud applications. We believe the notion of decoupling – 1) the “infrastructure plane” from the “insights plane”, 2) the edge from the cloud and 3) domain knowledge from technology – in an open, interoperable fashion is the only way IoT can scale longer term in an inherently heterogeneous, multi-edge and multi-cloud world.

In fact, we anticipated the shift to edge computing when we started accelerating our IoT investment back in 2015, building on top of 20 years of experience serving embedded OEM customers. This is why we led with our purpose-built Dell Edge Gateways in addition to spinning up the internal Dell Project Fuse effort, which turned into the open source EdgeX Foundry project of today.

EdgeX goes commercial


Earlier this summer, Dell Technologies Capital also invested in IOTech – a vendor-neutral “Red Hat” of EdgeX that is commercializing the code as its core business model. IOTech’s first offering “Edge Xpert” will enable customers that want to benefit from the open EdgeX ecosystem to invest in their choice of plug-in value-add rather than having to expend resources to support the open source EdgeX baseline.

Further, IOTech is building a second commercial variant of EdgeX that will serve hard real-time use cases while using the same APIs at the perimeter so customers can re-use device and application services. This has the potential to change the game in the automation world.

Important to note is that while EdgeX is a key step towards enabling machine intelligence in a very open and scalable way, the framework is not suitable for video analytics today. That said, at the most recent public face-to-face Technical Steering Committee (TSC) meeting, the community began adding support for binary data at the request of Hitachi Vantara. This will make the framework capable of passing through certain types of image data in a variety of use cases.

Committed to open standards for distributed computing


Overall, enabling distributed computing based on open standards is fundamental to our IoT strategy. We have been active participants in the Industrial Internet Consortium (IIC) and OpenFog Consortium over the past several years and have recently joined the Automotive Edge Computing Consortium (AECC), which will address considerations for both computer vision and machine intelligence in the automotive space.

Advancing the human/machine partnership


Now that I’ve presented all the elements, let’s take it a step further – especially powerful is leveraging computer vision combined with machine intelligence to provide people with even richer automated insights into the physical world!

For example, think about pointing a camera at a manufacturing conveyor belt to inspect the quality of parts flying by while simultaneously ingesting data from the PLCs and sensors on the machinery producing those parts. By analyzing the mashup of this data, both the production supervisor and quality engineer would know what was happening with the machines that produced the part when the cameras detected the flaw. [Side note: “machine vision” is also a term used in this space, which is basically computer vision applied in manufacturing use cases such as quality control, process and robot control, etc.]
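A minimal sketch of that data mashup, assuming invented field names and a simple time-window join rather than any particular Dell or partner API:

```python
# Hypothetical sketch: correlate a vision-detected defect with the PLC
# telemetry captured around the same moment, so an engineer can see what
# the machine was doing when the flaw appeared. All names are invented.

from dataclasses import dataclass

@dataclass
class Telemetry:
    timestamp: float      # seconds since start of shift
    spindle_rpm: int
    temperature_c: float

def telemetry_near(event_time: float, samples: list,
                   window_s: float = 2.0) -> list:
    """Return all telemetry samples within +/- window_s of the defect event."""
    return [s for s in samples if abs(s.timestamp - event_time) <= window_s]

samples = [
    Telemetry(10.0, 1200, 61.5),
    Telemetry(11.0, 1210, 61.7),
    Telemetry(12.0, 1950, 74.2),   # spike coinciding with the defect
    Telemetry(13.0, 1220, 62.0),
]

defect_time = 12.3                 # when the camera flagged a bad part
for s in telemetry_near(defect_time, samples):
    print(s.timestamp, s.spindle_rpm, s.temperature_c)
```

In a real deployment the join would run against streaming data rather than an in-memory list, but the principle is the same: the vision event supplies the timestamp, and the machine data supplies the context.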

This paradigm applies to nearly all other verticals and use cases in a similar fashion. In another example, a building facilities manager may merge event data from surveillance with machine data from devices like badge readers, motion sensors, beacons and thermostats.

AI and context-based reasoning


Of course, applying analytics to drive outcomes is a key part of computer vision (and machine intelligence for that matter). Minor rant first – I always get a kick out of those who talk Artificial Intelligence (AI) when they’re really just working with a basic IFTTT (If This Then That) rules engine.
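For the record, that kind of basic rules engine really is only a few lines of code. This hypothetical sketch shows a trigger-plus-action dispatcher with no context or reasoning in it at all (rules and event fields are invented):

```python
# A minimal if-this-then-that rules engine of the kind the post pokes fun
# at: each rule is a predicate plus an action, with no context, memory or
# reasoning involved. Purely an illustrative sketch, not a product API.

rules = []

def rule(predicate):
    """Register an action to fire whenever predicate(event) is true."""
    def register(action):
        rules.append((predicate, action))
        return action
    return register

def dispatch(event: dict) -> list:
    """Run every matching rule against an event, collecting actions taken."""
    actions = []
    for predicate, action in rules:
        if predicate(event):
            actions.append(action(event))
    return actions

@rule(lambda e: e.get("motion") and e.get("hour", 12) >= 22)
def night_motion_alert(event):
    return "send alert: after-hours motion"

@rule(lambda e: e.get("temperature_c", 0) > 80)
def overheat_shutdown(event):
    return "shut down line"

print(dispatch({"motion": True, "hour": 23}))
print(dispatch({"temperature_c": 85}))
```

Useful, certainly, but calling it AI is a stretch: every outcome is hard-wired in advance, which is precisely the gap between IFTTT and context-based reasoning.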

The promise of AI is, of course, true context-based reasoning. In other words, it’s about making sound judgement calls in a real-world situation with many inputs and many different potential outcomes. Ethics and morality come into play too but this post is shaping up to be long already so I’ll save those topics for another time.

It turns out it’s rather simple to tell a bad part from a good part on a manufacturing line with the right camera resolution and analytics tools – because it’s highly predictable (using the old “one of these is not like the others” trick).

It’s also pretty easy to retroactively search recorded video with tools that classify readily distinguishable objects, based on prescribed context. Like “show me everyone with red shirts in that area between the times of 1:00 and 1:15 last Wednesday” or “show me all the pink cars that turned left at that intersection today” (hopefully not that many!), and so forth.
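Queries like these reduce to a filter over classifier output once the archive has been tagged. A hedged sketch, with an invented detection-record layout:

```python
# Sketch of retroactive context-based search: once a classifier has tagged
# archived frames with objects and attributes, "show me everyone in a red
# shirt between 1:00 and 1:15" is just a filter. Record layout is invented.

from datetime import datetime, time

detections = [
    {"ts": datetime(2018, 8, 29, 13, 5), "label": "person", "shirt": "red"},
    {"ts": datetime(2018, 8, 29, 13, 10), "label": "person", "shirt": "blue"},
    {"ts": datetime(2018, 8, 29, 13, 12), "label": "person", "shirt": "red"},
    {"ts": datetime(2018, 8, 29, 14, 0), "label": "person", "shirt": "red"},
]

def search(records, label, start: time, end: time, **attrs):
    """Filter detections by class label, time-of-day window and attributes."""
    return [
        r for r in records
        if r["label"] == label
        and start <= r["ts"].time() <= end
        and all(r.get(k) == v for k, v in attrs.items())
    ]

hits = search(detections, "person", time(13, 0), time(13, 15), shirt="red")
print(len(hits))   # the red shirts between 1:00 and 1:15 pm
```

The hard part, of course, is producing those tagged records in the first place; the search itself is the easy bit.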

Further, it’s fairly rudimentary these days to train a model to tell the difference between a car and a bicycle. The ability to recognize specific faces and even gauge relative demographics (e.g. age, gender, race) with no prior knowledge of the individual is also getting quite (eerily) good.

Animals tend to be a little trickier, and while I wouldn’t be too impressed if your algorithm could pick out a giraffe in a field of hyenas, you’re getting warmer if it can tell a cat from a dog and you’re really starting to impress me if it can distinguish minor differences between animals in any given breed.

It’s all possible with the right image resolution, compute power and model training. It’s just that data scientists have been a little more focused on identifying humans than animals in the wild kingdom (but more on that below).

Making a judgement call


Diving deeper into context-based reasoning – how do I know that person who may look a little shifty is really up to no good? For example, if an algorithm detects a person leaving behind a bag in a crowded public space, did they purposely drop off something that can do some harm or did they just accidentally lose their gym clothes? It turns out making these calls accurately isn’t so easy. Where are the pre-cogs from the movie Minority Report when you need them?

To avoid a false positive while making a proactive judgement call about a theft, I would need to know things in the moment – like whether the suspected offender has a criminal record with the authorities, or at least whether my store’s private database holds a record of past questionable behaviors – and be able to analyze that historical behavior together with real-time context to definitively predict the theft before it happens.

It’s tricky, and we’ll likely have nearly as many false positives as humans do, but nevertheless, AI will increasingly help us automate these judgement calls. Maybe not like a pre-cog hanging out with Tom Cruise but, at the very least, catching crooks red-handed on the way out the door.

The true power of IoT is triggering actions in real time across a variety of interrelated use cases and contexts. It’s about a system of systems and networked intelligence. And part of this networked intelligence is the notion of combining sensor-based data with business data, whether it be your ERP, CRM, social networking tool du jour, or otherwise. Within the Dell Technologies family, Boomi can help with this data fusion!

Retail customer tracking… wait make that trending


Looking at automating judgement calls through sensor-driven analytics in a different context, how do I know that person who just walked into my retail store is a big spender, meaning I should summarily roll out the red carpet? [Aside: I found it funny at the National Retail Federation (NRF) event a few years ago when apparently the industry decided to no longer call following customers’ patterns in stores “tracking”, rather rebranding it as customer “trending” – after all, the latter does sound a little less creepy.]

In case you didn’t know – a sophisticated retailer knows for a fact that you stood in front of that end cap for 46 seconds debating but not purchasing those cookies on special. All because, you actually went to the freezer aisle to grab a pint of Ben and Jerry’s Chunky Monkey ice cream instead… at full price… for the third time this week!
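That 46-second figure is straightforward to derive once zone enter/exit events carry timestamps. A sketch with invented event shapes, as beacons or in-store cameras might emit them:

```python
# Illustrative sketch of customer "trending": derive dwell time per zone
# from timestamped enter/exit events. Event shapes and zone names are
# invented for the example, not taken from any real retail system.

def dwell_times(events: list) -> dict:
    """events: (timestamp_s, zone, 'enter'|'exit') -> seconds spent per zone."""
    entered = {}
    totals = {}
    for ts, zone, kind in sorted(events):
        if kind == "enter":
            entered[zone] = ts
        elif kind == "exit" and zone in entered:
            totals[zone] = totals.get(zone, 0.0) + (ts - entered.pop(zone))
    return totals

events = [
    (100.0, "end_cap", "enter"),
    (146.0, "end_cap", "exit"),        # 46 seconds debating the cookies
    (160.0, "freezer_aisle", "enter"),
    (175.0, "freezer_aisle", "exit"),  # grabbing the Chunky Monkey
]

print(dwell_times(events))
```

Aggregate those per-zone totals across shoppers and days and you have the "trending" data the retailer is acting on.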

Kind of creepy, right?

Privacy goes out the window with sufficient value


However, I find that as much as people talk about privacy concerns, it all goes out the window if sufficient value is received. If I had told you ten years ago that you’d leave location-based services (LBS) on your phone all the time, you would have thought I was crazy. But, guess what, the majority of people with smartphones today do just that. Where else do you think all those red and yellow traffic lines come from? (Waze, of course, is the ultimate opt-in way of capturing this data.) And then there are the always-on Alexas of the world.

It’s also about context – you know when you shop for something online and then your life becomes all about that product for like three months after? Literally, every single nook and cranny of your browser is an incessant carousel of that product. Did you forget something? No, I didn’t forget it – I chose not to buy it!

However, here’s where context matters. As much as I find that web phenomenon supremely annoying, let’s be honest, if I were to walk into a brick and mortar retail store (which means by the way that I just signaled strong intent to buy something), I sure would be happy to receive a personalized coupon on the spot. Even though it would still be a little creepy that they knew it was me that walked in, and that the coupon was for Ben and Jerry’s ice cream.

Using our AI powers for good


Sure, we’ll have some false positives along the way and privacy will unfortunately be violated (both unintentionally and intentionally) from time to time. Of course, GDPR is working to address this, more on that another time. However, ultimately technology is about driving human progress and as long as we use our powers for good not evil, I think we’ll be just fine.

Speaking of the wild kingdom and doing good, as part of its “Data for Good” program, one of our analytics partners, SAS®, is collaborating with the non‐profit WildTrack – an organization that uses non‐invasive techniques to monitor and protect endangered species.

With the help of SAS® technology – built on top of Dell PowerEdge server infrastructure – WildTrack is using extensive data on endangered species like cheetahs and leopards to improve conservation efforts. The organization collects footprint images and analyzes them with a customized statistical model in SAS to gain insights into the density, distribution and habits of these species; their software can identify species, sex, age-class and individual animals from their footprints. WildTrack researchers are also using Dell Latitude Rugged Laptops at their field sites in Namibia for managing their data.

Given enough data, with SAS deep learning, AI models can be trained to perform human-like tasks such as identifying footprint images and recognizing patterns in a similar way to indigenous trackers in Africa – but with the added ability to apply these concepts at a much larger scale and more rapid pace. WildTrack has developed algorithms to monitor a range of species, from the Amur tiger and giant panda in China to the black rhino in Namibia and the mountain lion in the Americas, and in partnership with SAS is actively expanding its program to roll out more innovative techniques in the future.

This is very cool stuff, and of course, even more interesting is when you combine all of the above with emerging technologies like blockchain and AR/VR, but more on that in future blogs!

From DVR to automated real-time insights


Now to be fair, computer vision has been around a long time – in the same way that many use cases that are now called IoT have. However, as with IoT, we’re at an inflection point where available technology, not to mention an ever-increasing demand for real-time data, is making these trends accessible to the masses and soon to be pervasive.

In the case of computer vision, this is especially due to the advent of better tools including drag and drop interfaces to train models (almost “Pinterest like”) and co-processing via GPUs and FPGAs to greatly accelerate analytics workloads. While we’re getting closer to the art of the possible, we have to stay vigilant for issues with privacy and false positives that I spoke about earlier.

And yet, today, it’s true that something on the order of 90% of surveillance deployments are really just fancy DVR at best, with best-in-class analytics being retroactive, context-based search of archives after an event such as a crime.

Surely you’ve seen a TV or movie scene with some security guard eating out of a bucket of fried chicken while staring at a wall of fuzzy 8-inch black-and-white CCTV screens. This has been the norm (perhaps minus the KFC) for many years. However, it’s getting cost prohibitive to put people in front of screens to try to capture critical events in real time, plus in order to drive new business outcomes (such as offering a customer a deal when they walk into your store) you need to act in the moment. Here’s where computer vision kicks in.

A clear path to innovation


We’re seeing more and more sensors coming online in general and the introduction of 4K video is enabling new analytics-driven outcomes because of the more granular detail that can be captured from the physical world compared to traditionally lower-resolution CCTV cameras. Like detecting a license plate number or specific face from a long distance, or a slight bulge in someone’s jacket that wasn’t there when they walked into the store, or that it was actually Cherry Garcia and not Chunky Monkey ice cream that I grabbed.

We’re also seeing increasingly creative usage of drones equipped with sophisticated sensing capabilities, not only outdoors for inspection of infrastructure like oil pipelines and bridges but also indoors to take inventory in a warehouse by flying around after-hours scanning barcodes on packages.

Shallow learning


And with all of this vision-based data, we’re seeing deep learning techniques move closer and closer to the device edge in addition to more and more silicon purpose-built for AI. Another recent Dell Technologies Capital investment was in Graphcore – a startup developing AI-optimized silicon that can be used for both learning and inference.

Whether Graphcore’s processor sits at the edge (for example, in an autonomous vehicle), the cloud or somewhere in between simply depends on use case. We’re seeing similar investment and M&A activity across the market, including Intel’s acquisition of Movidius.

The deepest of deep learning will always happen in the cloud where we effectively have infinitely scalable compute but we’ll continue to see more and more models being pushed to and even trained at the extreme edge. I jokingly call this “shallow learning”.

Invest now so you don’t pay for it later


All of the use cases I’ve talked about and more are driving new architectures including an increasing need for edge compute. Net-net, even in the case when a customer may think they’ll be fine with pumping data to the cloud for a simple IoT use case or leveraging fancy DVR for surveillance with brute-force reactive search, we advise them to invest in more compute headroom at the edge now.

My message is – don’t get caught later having to rip and replace infrastructure in order to drop in new AI workloads for real-time analytics. You need to equip yourself now to stay competitive as the world transforms around you.

One infrastructure for distributed computing


In a perfect world, a user who is leveraging drones to capture high-resolution aerial imagery would love to be able to fully process that data in the drone itself, but unfortunately the laws of physics can be pesky at times.

However, at the very least they’ll want to increasingly pre-process that data in the drone, then send a smaller data set to a server cluster in a nearby field base station (perhaps in a truck or Modular Data Center) for further processing, and finally backhaul only the most meaningful information to the cloud for additional processing and archiving. This is compared to sending huge image files directly over a cellular, or worse, satellite connection to some distant cloud.
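The staged reduction can be sketched with an assumed keep-ratio at each tier. The numbers below are purely illustrative, not drawn from any real deployment:

```python
# Back-of-the-envelope sketch of staged edge processing for a drone survey:
# each tier keeps only a fraction of the data it receives before sending
# the remainder northbound. Reduction ratios are assumptions for illustration.

STAGES = [
    ("drone (pre-process)", 0.10),   # keep 10% of the raw imagery
    ("field base station", 0.20),    # keep 20% of what the drone sends
    ("cloud (archive)", 1.00),       # archive everything that arrives
]

def volumes(raw_gb: float, stages=STAGES) -> list:
    """Data volume leaving each tier, given a raw capture size in GB."""
    out, current = [], raw_gb
    for name, keep in stages:
        current *= keep
        out.append((name, current))
    return out

for name, gb in volumes(500.0):      # assume 500 GB of raw aerial imagery
    print(f"{name}: {gb:.1f} GB")
```

Under these assumed ratios, only a few percent of the raw capture ever crosses the cellular or satellite link, which is the whole argument for compute at the field base station.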

And in any event, with GPU- or FPGA-based acceleration compared to processing with a traditional host CPU, an image that once took weeks to process can now take days, if not hours.

In all cases, our goal at Dell Technologies is to equip customers with the right scalable, secure, manageable and open infrastructure today so they can grow into new use cases over time, simply by adding more devices and pushing new applications via containers or VMs to any compute node spanning the edge to the cloud.

We build the guts so our partners can bring the glory


Important to stress again is that at Dell Technologies, we’re all about the underlying infrastructure. The last thing we want to do is compete with partners and customers that were deploying IoT use cases before it was called IoT.

As such, we’re curating an ecosystem of Technology and Services partners that provide domain- and use-case specific offerings on top of our engineered infrastructure foundation to deliver complete outcome-oriented solutions for customers. Our foundation will be equally applicable to scale out any use case when paired with the right partner software, hardware and services, and EdgeX provides an open way to facilitate increasing plug-and-play interoperability across the board over time.

Simplifying surveillance as the first stop on the vision train


My colleagues in the Dell Technologies IoT Solutions Division have done a fantastic job pulling together our first targeted solution for surveillance through close collaboration with the Dell Technologies portfolio Business Units, our analytics and camera partners and the open source EdgeX community. This will continuously evolve into a foundation supporting myriad use case-specific solutions spanning both the computer vision and machine intelligence themes.

As part of this effort, we have validated various sizings of the combined solution in our labs and made sure that it’s not only simple to deploy but also highly reliable. While there are great benefits of virtualizing compute, networking and storage for scalability and performance, we also want to make sure that we offer rock-solid uptime so customers don’t experience service dropouts in critical moments.

History repeats itself


I’ll close with a story of how I believe computer vision is taking a similar path to the POTS to VOIP (Plain Ol’ Telephone System to Voice over IP) transition in the enterprise.

Looking back, despite OT (Operations Technology, e.g. facilities within a building) traditionally owning the phone system, the business ultimately directed IT to take on the support of emerging VOIP technology due to the savings involved and flexibility gained.

Similarly with video, OT has historically owned CCTV systems but the technologies involved with where we’re headed (compute, storage, networking, AI etc.) are driving a need for IT skills to deploy and manage computer vision solutions in scale.

As I like to say, IoT starts in OT but scales in IT.

On that note, I talk quite a bit about the importance of OT and IT convergence, including jokingly highlighting how this trend makes for the preeminent IoT conference Venn diagram.

At Dell Technologies, we’re here to work with our partners and both OT and IT organizations alike to make this transition as smooth as possible so customers can benefit from entirely new business outcomes in spades.

Tuesday 28 August 2018

It’s Now Possible to Run Unified Storage in the Cloud with Dell EMC Unity VSA Cloud Edition

All customers, large or small enterprises, firmly established or relatively new, are deploying storage and servers in new and innovative ways, far differently than just a few years ago. Today, most IT environments are highly virtualized and leverage the cloud for cost-optimized data placement. For midrange customers, our new Dell EMC Cloud Edition software drives new flexibility with cloud and virtualized infrastructures. With even more features for the cloud, and more options to deploy Dell EMC Unity as SDS (Software Defined Storage), converged systems or traditional hardware, Dell EMC Unity is the ideal platform for all VMware environments. And with this release, we deliver even more options to deploy efficient, simple and flexible storage.


Dell EMC’s Unity platform is the number one midrange storage platform in the market. We’re listening to you, and based on your feedback, we’ve added numerous new features. Recent innovations include inline compression and deduplication, MetroSync, and a wide range of security certifications that help our customers deploy Dell EMC Unity across the widest range of workloads possible. Our customers also use Dell EMC Unity on purpose-built hardware, engineered converged infrastructure platforms and software-defined storage.

In this version, we’ve added a diverse set of cloud-related features that leverage the scale and ease of access of the cloud. Many customers look to merge the flexibility of software-defined storage with the scalability of VMware Cloud, and we’re delighted to now provide those features. Dell EMC Unity platform users now have a virtual data center that expands beyond the confines of traditional physical space, and VMware Cloud users now have access to the extensive host of Dell EMC Unity file services, data protection and storage management features. Every type of Dell EMC Unity deployment benefits from more options and more flexibility.

Dell EMC Unity VSA Cloud Edition



Customers can deploy Dell EMC Unity either on traditional hardware or as software with Unity VSA (virtual storage appliance), and the features are identical. The Dell EMC Unity VSA Cloud Edition provides a unified storage management environment with block, file and VVols support, as well as snapshot, replication and scheduling services. The intuitive user interface, now based on HTML5, is identical for every version as well, so all users are immediately at home and productive. And we provide full integration for VMware environments, including VSI and ESI plugins and VAAI support.

At Home in the Cloud


The Dell EMC Unity VSA Cloud Edition is deployed and certified on a standard VMware ESXi server. With VMware Cloud available on AWS (Amazon Web Services), the cloud-based version of Dell EMC Unity deploys quickly and is ideal for test and development, and as a cost-effective replication destination.

Dell EMC Unity VSA Cloud Edition also brings its robust file services to the cloud. Providing more flexibility and portability, the VMware Hybrid Cloud Extension enables large-scale data migrations between cloud environments and users’ on-premises Dell EMC Unity hardware and virtual environments.


Dell EMC Unity customers seeking cloud deployments for their business now have a path without learning anything new – it couldn’t be easier. The same Dell EMC Unity operating environment and intuitive user interface are now available in all environments, in the cloud or as traditional hardware. For those adopting the cloud today, or those considering multi-cloud in the future, Dell EMC Unity is ready on your timetable, so you get to pick the timing and the product version that best meets your needs.

The Road Ahead


According to IDC, the total spending on IT infrastructure for the cloud is $52.3B and growing at 10.9% annually.  Whenever you’re ready to jump in, we’re ready too.  With the new Dell EMC Unity VSA Cloud Edition, both first-time cloud users and more established multi-cloud IT shops will benefit from the feature-rich capabilities in Dell EMC Unity, now optimized and packaged for the cloud.

Saturday 25 August 2018

Get 2018 Off to a Flying Start with the Launch of Dell’s New Latitude 2-in-1s


It’s the start of a brand-new year and we’re starting it off with a bang via a whole new range of mobile solutions to enable your Dell Client Solutions revenue growth.

With a focus on the ‘On-the-Go Pro’ persona, we’re driving our new 2-in-1 range to take advantage of this growth area in the notebook market. However, the PC sale is no longer just about selling to the IT department; we need to influence the users as well as the budget owners.

That’s why we’re taking a two-pronged approach to our messaging, to help you effectively close sales in the first quarter of the year. Each approach focuses on the core value we can deliver to employees. Messaging to the IT decision maker is about ease of deployment and how these highly portable solutions are incredibly secure. The messaging to users is all about how these new devices can improve productivity.

We’ve integrated services and accessories into this messaging to make it easier for you to attach, attach, attach and add even more value to your customers.


The Rise of the 2-In-1


According to IDC’s Worldwide 2017 Q3 Personal Computing Device Tracker report, convertibles are taking off among business users, growing at 33.6% worldwide year-on-year.*

Simply put: There’s never been a better time to help your customers plan and deploy their device upgrade strategy, based on decisions that help manage cost and time, while increasing productivity and flexibility.

The Perfect Companion for the ‘On-The-Go Pro’


Whether it’s cab riders or frequent fliers, the ‘On-the-Go Pro’ employee segment is growing rapidly. These users demand access to people, programs and data from anywhere, and want lighter, more portable devices that won’t slow them down. Not only that, but they need devices that allow them to rocket between apps and emails while multitasking quickly and securely, without getting tripped up on bulky cords or low battery power.

You can enable them by powering up their productivity with greater portability and connectivity. Compatible with the Intel® vPro™ platform built for business, our lightweight convertible 2-in-1s offer a super-reliable wireless mobile broadband connection to the office, the network and the team – no matter where they are. And with 39% of mobile workers juggling multiple devices** on the move, deploying 2-in-1s into the workforce basically turns the whole world into a highly productive workplace. Plus, we have the perfect compact accessories to reduce IT clutter and maximize productivity:

◈ Hybrid Power Companion acts as a power source for the 2-in-1 at the desk and a power bank for the 2-in-1 and smartphone when away.
◈ USB-C Mobile Adapter keeps them connected, providing universal video and data connectivity and keeping the cable tucked out of sight.


Making It Easy for IT


When your customers are ready to make the switch, we can assist you with Dell ProDeploy Plus. All of our new Dell Latitude convertible 2-in-1s and notebooks can be imaged straight from the factory, so their IT teams can spend more time on value-add projects and less time on PC deployment.

But it’s not just ProDeploy Plus that helps fast-track their deployment; our new Dell Remote Provisioning Tool can turn on their Intel® vPro™ platform in less than an hour. That means they can take advantage of optimal out-of-band management, including:

◈ Updating one time for multiple systems, allowing significant time reduction for remote manageability
◈ Remote BIOS management
◈ Remote hard-drive wipe
◈ Remotely adjusting battery settings

IT and end users alike will experience all the performance, security, manageability and stability vPro™ offers, in a fraction of the time it normally takes to set up.


Worry-Free Security


Security concerns are most likely top of mind among IT leaders when it comes to their mobile users. Give their workforce the mobility they want, and the peace of mind the IT department needs, knowing devices and data are safeguarded with Dell EMC’s built-in security options.

Your customers can get even more worry-free security with Intel® Authenticate, which allows you to define a customized PC login policy requiring two or more proofs of identity before a user can access a given computer. Pair it with optional Windows Hello infrared-camera facial recognition on our 2-in-1s for increased security that keeps mobile users protected and productive – no matter where they go.

Get the Message Out


We’ve built a set of marketing assets that will help expand your revenue potential by putting you at the forefront of your customer communications, explaining the benefits of an end-to-end PC refresh with the new Dell Latitude 2-in-1s leading the way.

You can add your own personal touch to these marcom assets for deeper engagement, while the quick-reference call scripts provided will equip your sales force with the right answers when engaging customers.

Thursday 23 August 2018

Inspired by OEM Customers, Dell EMC is First to Market with 1U Rack Workstation

In my book, delivering a great customer experience is all about listening and responding. You’re never done and dusted – you have to keep your ears wide open and stay tuned as each customer’s needs keep evolving.


For example, some of our OEM customers told us that while they loved the power of our Precision 7920, a 2U rack, dual-socket platform was often more than they needed in terms of performance, scalability and price. Bottom line, they felt that they were paying for functionality they didn’t always need.

Size matters for OEM customers


Size also matters. Unlike in Texas, where bigger is usually better, OEM customers need small-form-factor products that don’t compromise on performance. Why? Well, they put these systems in seriously tight spots that an IT team in a data center would never dream of. And so, in direct response to OEM customer demand, we decided to design a 1U, shorter-depth rack workstation.

Welcome the Dell Precision 3930


Fast forward to today and we’re delighted to welcome the brand new Dell Precision 3930 – the first 1U rack workstation available in the market from a tier one vendor, offering the reassurance of data center security and cloud access with affordable price/performance!

Inspired by our OEM customers, this now forms part of the standard Dell EMC workstation product portfolio. And we know from initial reactions, our OEM customers just love the Dell Precision 3930 as it has been designed with them in mind.

Ideal for tight rack environments


As the saying goes, the best goods come in small packages. Powered by 8th-generation Intel Core and Xeon E3 processors, the Dell Precision 3930 comes in at 22 inches deep versus the standard 24-inch rack depth.

This is ideal for tight-rack OEM environments. Picture a control workstation for failure analysis in an industrial automation setting, a robotics control station on a manufacturing floor, or a sports stadium running a video surveillance system. The Dell Precision 3930 fits all these use cases. And of course, as it’s a 1U design, it’s perfect for small-compute manufacturing solutions.

Some don’t mind it hot!


On selected configurations, the Precision 3930 can comfortably cope with extended operating temperatures of up to 45°C. This means the product can be put to work where the work is actually happening, say on a hot factory floor, rather than being set aside in a cooled data center. And of course, optional dust filters are available for extra-tough environments.

Optional redundant power supply for failover


Typically, redundant 550W power supply units are offered on servers but not on client products. The Precision 3930 bucks that trend by offering an optional redundant power supply.

Each power supply unit is capable of powering the entire workstation. In the event of failure, you have the reassurance of knowing that the second power supply will automatically kick in to keep the workstation operating without any interruption. The switch over is completely seamless.

Important for the OEM customer experience


This reassurance is especially important for situations where failure is not an option. Think about it. You certainly don’t want a power failure in a situation where a doctor is performing a cancer diagnostics scan for a patient.

Support for double-wide, higher-end graphics & remote power switch


So, what else does this little beauty have up her sleeve? Despite its compact size, you can also enjoy higher-end graphics as the 3930 supports double-wide GPUs. Again, this is especially important for our Medical and Healthcare-focused OEMs. Meanwhile a remote power switch is ideal when the 3930 is embedded in an Industrial Automation or Medical Diagnostics solution or kiosk.

Client OS rack workstation


In the past, some of our OEM customers have purchased a 1U rack mount server and would then re-engineer it to support a non-Dell supported Client OS. If you’re in this category, the good news is that you now have a cost-effective workstation solution that perfectly fits your needs.

Long-life product cycles


Finally, we know from experience that our OEM customers value stable product platforms. With our OEM XL program, we guarantee that there will be no change to the processor (CPU) or motherboard part number of the 3930 XL for the life of the platform. And because we understand that qualifying, testing and certifying your next-generation solution takes time, we provide an extended transition period of 18 months from the launch of new, next-generation platforms. You also get a minimum of three months’ notice on changes to secondary components, like graphics cards and hard drives.

Focus on innovation and not technology churn


This stability allows you the time and space to focus on new innovations versus dealing with certification and constant hardware churn. Thanks to this long-term product roadmap, you can also plan and allocate your budget more effectively.

Perfect solution for OEMs


In summary, I believe the Dell Precision 3930 is the perfect product for OEMs that either require off-the-shelf, rack-based compute devices for solution integration, kiosks or space-constrained designs, or alternatively need small compute power with support for a double-wide graphics card, extended operating temperatures and long life cycles.

OEMs – this product is for you! Keep telling us what you need – we’re listening.

Wednesday 22 August 2018

Boost Your SQL: The SQL Server Maintenance Solution Supports Dell EMC Data Protection

Need a simple, free solution that allows you to switch easily from SQL Server native backups to Data Domain Boost? You’ve found it: Dell EMC and Ola Hallengren have partnered to add support for our leading deduplication offering, DD Boost, to Ola Hallengren’s award-winning SQL Server Maintenance Solution. Our technology and Hallengren’s scripts are designed for mission-critical environments and advanced backup scenarios, leveraging the latest features in SQL Server.


The combination of DD Boost with Data Domain yields impressive efficiency for backup and replication, capitalizing on Data Domain’s ability to reduce storage requirements and address the breadth of today’s data protection challenges. Client-side deduplication enabled by DD Boost speeds backups by 50% and reduces bandwidth requirements up to 98%. Your environments can scale higher, backup and restore faster, and reduce the load on your server during backup. And DD Boost’s networking features for link failover, aggregation and load balancing provide optimal network utilization while ensuring that backup jobs complete.
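
As a back-of-the-envelope illustration of the bandwidth claim (the backup-set size is hypothetical; the 98% figure is the “up to” number quoted above):

```python
# Rough arithmetic on client-side deduplication savings (illustrative only).
full_backup_tb = 10.0        # hypothetical nightly backup set
bandwidth_reduction = 0.98   # "up to 98%" reduction with DD Boost

data_sent_tb = full_backup_tb * (1 - bandwidth_reduction)
print(f"On the wire: {data_sent_tb:.1f} TB instead of {full_backup_tb:.0f} TB")
```

Even at more modest, real-world reduction rates, sending only changed, deduplicated segments is what lets backup windows shrink while the load on the database server drops.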


Data Domain Boost: Faster, More Efficient Backup


What’s more, DD Boost gives app owners direct control of backup to Data Domain using Microsoft SQL Server Management Studio. Because app owners control the recovery process through native utilities, they don’t need to go through backup admins, and so achieve faster recovery.

Deployed by numerous organizations globally and repeatedly voted Best Free Tool in SQL Server Magazine, Hallengren’s SQL Server Maintenance Solution automates and improves your database maintenance to assist with important, but time-consuming jobs. The solution offers scripts for backups, integrity checks, and index and statistics maintenance. And you can perform tasks independently on the databases you specify.

Dell EMC – the market leader in the Data Protection Appliance & Software Market –  protects your infrastructure investments so you can stay focused on realizing your desired business transformations.

Sunday 19 August 2018

Data Capital: What’s That?

Digital Transformation is creating massive opportunities for technology sellers but customer success is dependent on the data


Your customers, their competitors, even the media all know that digital transformation is here, but how does the modern company succeed with it? The hype behind this term and its underlying buzzwords (AI, Big Data, IoT and Digital Experiences, to name just a few) are seen as the way to lock in long-term success as we move into the fourth industrial age. Yet without a corresponding focus on data, these implementations will be limited or may fail outright.

Why? Simply put, data powers digital transformation. We call this data capital: the value created from leveraging data to drive services or derive insights. The most advanced use cases are heavily reliant on having substantial amounts of data to make sense of. Take building an autonomous vehicle: these companies will need fleets driving hour upon hour and millions of miles before they achieve their goal. Think about the size of that undertaking, but also realize that a single vehicle in such a fleet can generate up to 100TB of data in a single day of driving. If the auto manufacturer hasn’t invested in a data platform that can support that kind of volume, they will be limited in their ability to analyze the data.
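
To put that number in perspective, a quick, purely illustrative calculation (fleet size and collection period are invented; the 100TB-per-day figure is from the text above):

```python
# Fleet data volume estimate under assumed fleet size and schedule.
tb_per_vehicle_per_day = 100   # from the text: up to 100TB per vehicle per day
fleet_size = 50                # hypothetical test fleet
days = 365                     # one year of driving

total_pb = tb_per_vehicle_per_day * fleet_size * days / 1000
print(f"~{total_pb:,.0f} PB of raw sensor data per year")
```

Even a modest test fleet lands in petabyte territory within a year, which is why the storage platform decision has to come before the analytics ambition.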

Organizations are struggling under the weight of their data


Most of the companies you work with have a sizeable data footprint already, and if it hasn’t been consolidated into a unified data lake then that data will have limited applicability and value across their business.

This is your first opportunity: Have a conversation with your customers about how they can leverage Dell EMC Isilon and ECS to consolidate their workloads onto a massively scalable data platform to save money and enable this data to be put to use.

Keeping up with data growth


Unlike traditional companies, those who have gone digital thrive on valuable data and therefore require more and more of it. Over time, as more data is accumulated, it can provide more accurate predictions, power broad digital services, and be used to automate core functions. That all sounds good, but the corresponding costs of maintaining large data sets on aging infrastructure can weaken the company’s position.

Your second opportunity: Help organizations deal with massive data growth by eliminating much of the operational cost and moving them to the Isilon platform, which can be managed by policies.

Extracting value from data


Assuming an organization has gotten this far, they now need to figure out how to use the data. As digital transformation expands beyond technology-driven companies, this means assembling a new skill set for most organizations. If they struggle to get the infrastructure in place, they won’t even make it to the point of developing complex business logic or enabling their business with IoT devices.

Your third opportunity: Enable your customers to build on top of Dell EMC Unstructured Data Solutions that work with the leading data analytics solutions, support real-time streaming analytics, simplify workflows, and run cloud-native apps.

Help your customers get the most from their Data Capital


Seize the opportunity to be a trusted advisor and help organizations make their digital transformation quickly and safely. As our own sellers have seen, telling the Data Capital story with customers helps you break out of the bottom-dollar commodity storage conversation and focus on delivering value to the organization.

How Cybersecurity Can Unite the CFO and CIO

With five new types of cyberthreats popping up every second, business success is about more than just innovation and growth. It is also about protecting the company’s intellectual property, reputation and shareholder value – and this means incorporating a comprehensive security strategy.


Even though CFOs fully understand the reality of cyberthreats and have witnessed the financial and reputational impact of attacks, they don’t always recognize the need for their involvement in a cybersecurity strategy. But here again, it is the joint responsibility of the CFO and the CIO to protect the company’s key assets, and that includes the digital ones as well. Only by working hand in hand will they bring cybersecurity awareness to a higher level within their company. Being a CFO myself, and assuming my share of the responsibility for the company’s assets, I thought I would share some of my experiences with you and explain why such a step is becoming much more than a necessary evil.

Attacks are inevitable


“It can’t happen here.”

This is a sentence I used to hear when visiting customers. But the truth is, we all know now that nobody’s 100 percent safe in the modern age, either on a personal level or from well-publicized, organization-specific ransomware cases like WannaCry (300,000 computers infected) and NotPetya (several well-known multinationals in panic). Add to this daily reports of data breaches involving major retailers, financial institutions, internet companies and even dating sites, and it is not very difficult to understand why individuals and businesses alike are becoming less self-assured when it comes to cyberthreats.

I am convinced that there are only two types of companies: those that have been hacked and those that will be.

“I am convinced that there are only two types of companies: those that have been hacked and those that will be. And even they are converging into one category: companies that have been hacked and will be hacked again,” said former FBI director Robert Mueller, quoted in the Connected CIO booklet from Dell EMC.

Understandably, today’s businesses would prefer to stay off the radar of cybercriminals. Even the most serious banks now keep a low profile. The key is to not tempt hackers, whose favorite techniques now include cryptojacking and fileless malware. In a recent IMF blog, Christine Lagarde estimates the cyber risk to the financial sector, labeling it a significant threat to the financial system. The IMF suggests that average annual potential losses from cyberattacks may be close to nine percent of banks’ net income globally, or around $100 billion. These are staggering numbers indeed, and do not even cover the worst-case scenario. Taking into account that the financial sector has always been one of the most protected segments, this leaves much room for thought about the extent of potential losses in other sectors, such as manufacturing. The figures above are based solely on those data breaches that are publicly known. This is just the tip of the iceberg, and I would bet it covers only something like 10 percent of all the real cases.
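
Taking those figures at face value, the implied magnitudes are easy to work out (this is my own rough arithmetic on the numbers above, not an IMF calculation):

```python
# Implied magnitudes from the figures above (illustrative arithmetic only).
reported_losses_b = 100            # ~$100 billion, about 9% of net income
implied_net_income_b = reported_losses_b / 0.09
known_fraction = 0.10              # the bet: known cases are ~10% of real ones
implied_true_losses_b = reported_losses_b / known_fraction

print(f"Implied global bank net income: ~${implied_net_income_b:,.0f}B")
print(f"Implied losses if only 10% of cases are known: ~${implied_true_losses_b:,.0f}B")
```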

Traditional ‘product’ approaches not enough


Last year, a leading manufacturing company specializing in personal care was crippled by a huge data breach. They turned to my employer, Dell EMC, to help them build and implement a multi-layer cybersecurity strategy encompassing everything from data encryption to tape backups and cyber insurance. For years, they had been a bit lax in terms of security, and it turned out that traditional strategies, relying on a collection of heterogeneous products, were no longer enough to cope with the ever-increasing ingenuity of hackers.

Joint custodians


Examples such as this highlight where a strong CFO-CIO collaboration can make a substantial difference. Given that the CFO is responsible for the company’s assets and the CIO is the gatekeeper of the IT infrastructure who makes security happen, they have a joint responsibility to build a comprehensive strategy that relies on more than a few randomly assembled ‘magic’ security products.

1. Keep your friends close, but your enemies closer

This means analyzing all your organization’s vulnerabilities in detail and taking appropriate action. It starts with very simple and practical measures, such as making sure employees change their passwords regularly and log off their computers when not in use. CFOs should make sure that sufficient funding goes into workshops, training and communication efforts to raise security awareness company-wide. Do not forget to take social networks into account during this exercise. There are facts employees should never expose on Twitter, Facebook or LinkedIn if they play a role in the security chain, such as holiday dates or job descriptions.

2. Get your cybersecurity toolbox organized

Together with a trusted partner on the technical side, CFOs must take a hand in directing the implementation of security tools, data encryption techniques and recovery solutions. One key point, in an age where data is the new oil, is the ability to prioritize or tier the data that is backed up, so that the most critical data can be recovered quickly in the event of a breach or attack. In addition, the most critical data should have the most secure and frequent backups. And that is right up the CFO’s alley, where they can prove their added value to the CIO, both literally and figuratively speaking.
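
The tiering idea can be sketched as a simple policy table; the tier names, frequencies and copy counts here are invented for illustration, not a Dell EMC product configuration:

```python
# Illustrative backup-tiering policy: the most critical data gets the most
# frequent and most protected backups. All values are assumptions.
POLICY = {
    "critical":  {"frequency_hours": 1,   "offsite_copies": 2, "encrypted": True},
    "important": {"frequency_hours": 24,  "offsite_copies": 1, "encrypted": True},
    "archive":   {"frequency_hours": 168, "offsite_copies": 1, "encrypted": False},
}

def backup_plan(dataset: str, tier: str) -> str:
    """Describe how a dataset in a given tier is backed up."""
    p = POLICY[tier]
    copies = "copies" if p["offsite_copies"] > 1 else "copy"
    return f"{dataset}: every {p['frequency_hours']}h, {p['offsite_copies']} offsite {copies}"

print(backup_plan("finance-db", "critical"))
```

A table like this is exactly the conversation the CFO can lead: which data earns the hourly tier, and what the rest can tolerate.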

Given that a security strategy will never be 100 percent successful (80 percent of incidents are caused by humans), the essential questions the CFO can help the CIO answer are:

◈ How do I protect the heartbeat of the business if I am the victim of a cyberattack?
◈ What loss of assets would affect the daily operations of my business if the organization were under attack?
◈ How could we lose consumer confidence?
◈ And what could have an impact on shareholder value and our reputation in the market?

Usually, less than 10 percent of the total data needs to be recovered quickly to avoid major losses.

While the CFO frees up the necessary budgets, the CIO should offer technical advice on the IT choices as well as actually embed the cybersecurity strategy within the daily operations. For any new IT project, the Connected Partnership needs to reflect together on the security risks, finding the right balance between openness and isolation. In our interconnected world, you cannot close all the gates, but you can proactively incorporate the right tools to detect when something goes wrong. By doing so, CFO and CIO will be well-positioned to move from a strategy of detection to one of protection.

To put on my CFO hat for a moment, I can confirm that a lot of money does indeed flow into cybersecurity and threat prevention. But cutting costs there because ROI is difficult to calculate is a false economy. The risk of investing insufficiently in cyber protection is losing hard-earned goodwill, for both your company and your customers. Who would take the risk of cutting costs on the smoke detectors and fire alarms in their office building?