By now, you have probably heard the term: edge computing.
The industry is buzzing about this cutting-edge technology. Put simply, edge computing places computing resources closer to where data is generated rather than sending data back to centralized servers or to the public cloud. Organizations leveraging this technology typically do so to improve performance, user experience, and security, and to reduce costs, giving them an edge over their competitors.

In our recent blog post, we highlight several use cases showing how edge computing can take advantage of hyperconverged infrastructure (HCI). This shift toward modernizing integrated systems is helping organizations achieve operational excellence. For orgs that may not have dedicated IT pros to set up, deploy, and manage this new infrastructure, it's still possible to get started with edge-ready HCI (and if you'd like some examples, feel free to take a look at this resource here).
I’m curious what you think. Does edge computing/HCI make sense for your organization at this point? Has it been on your radar to implement in the future? Why or why not?