How NearDB Aims to Solve Edge Computing’s “Data Dilemma”
Leo Farias
Posted on: January 22, 2019
Development
Tags: data edge computing enterprise solution NearDB

NearDB is an emerging tool designed to provide an enterprise-oriented solution to edge computing’s latency issues.

As the presence of smart devices in everyday life grows, edge computing gains importance as a way to build fast, responsive applications.

There’s a problem, though. Applications that run on the edge but access data from the origin still create high network latency.

Developers have a bag of tricks for getting around this, and there’s another on its way: NearDB. Created by Concepta’s own Leo Farias, NearDB aims to provide an enterprise-oriented solution to edge computing’s “data dilemma”.

What is Edge Computing?

Edge computing is exactly what it sounds like: computing that happens near the client.

Specifically, computation is done mostly or entirely on distributed device nodes rather than being routed back to a centralized cloud.

It represents a shift in thinking from the “all-cloud” trend.

The change is driven by the rising popularity of IoT and smart devices, which demand both rapid calculations and a high quality user experience.

What Does Edge Computing Offer?

Speed is the main reason developers turn to edge computing.

Faster response leads to better user engagement, high retention levels, and more conversions.

It’s key to creating technology that helps users rather than being another source of frustration.

To understand why edge computing is so much faster, think about network latency.

Network latency is the amount of time required for a message to travel from a sender to the intended receiver.

It’s a function of distance over the speed with which the signal propagates.

Data transport is extremely fast. A signal in optical fiber moves at roughly ⅔ of the speed of light in a vacuum, about 200,000,000 meters per second (200,000 km/s).

However, there’s a misconception about how data travels. It doesn’t flash straight from point A to point B.

Instead, data moves from node to node, and its path may seem counterintuitive.

It may have much farther to go than the straight linear distance between two points.

If an application can do some or all of its computing on the host device (or maybe a nearby node), the distance data has to travel is greatly reduced.

That means faster responses and better overall performance.
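
To put rough numbers on it, here’s a minimal sketch of the latency math. The fiber speed comes from the figure above; the distances are illustrative assumptions, not measurements.

```typescript
// Rough, illustrative latency math (the distances below are assumptions).
const FIBER_SPEED_KM_PER_MS = 200; // ~200,000 km/s, roughly 2/3 the speed of light in a vacuum

function oneWayLatencyMs(pathKm: number): number {
  // One-way latency is simply distance divided by propagation speed.
  return pathKm / FIBER_SPEED_KM_PER_MS;
}

// A request that crosses 8,000 km of fiber to a distant origin vs. one served by a nearby edge node.
console.log(oneWayLatencyMs(8000)); // ~40 ms each way to the distant origin
console.log(oneWayLatencyMs(100));  // ~0.5 ms each way to an edge node ~100 km away
```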

One of the best examples of this is the “HTTPS Problem”.

Handshakes are necessary for secure applications, but they require multiple round trips. That can slow the application’s performance significantly.


Edge computing allows the handshake to happen on the edge, so round trips become much faster.
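
As a back-of-the-envelope illustration: the round-trip count below assumes a typical TCP handshake plus a classic TLS 1.2 handshake, and the round-trip times are assumed, not measured.

```typescript
// Illustrative handshake cost: TCP (1 RTT) plus TLS 1.2 (2 RTTs) means roughly
// 3 round trips before the first byte of application data arrives.
const ROUND_TRIPS_BEFORE_FIRST_BYTE = 3; // assumption: TCP + TLS 1.2

function handshakeCostMs(roundTripTimeMs: number): number {
  return ROUND_TRIPS_BEFORE_FIRST_BYTE * roundTripTimeMs;
}

console.log(handshakeCostMs(100)); // ~300 ms against a distant origin (assumed 100 ms RTT)
console.log(handshakeCostMs(10));  // ~30 ms against a nearby edge node (assumed 10 ms RTT)
```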

Running Applications on the Edge

There are many ways to run an application on the edge:

Distributed servers with geographic load balancing

Software is used to manage traffic levels across a connected series of geographically distant servers.

Lambda@Edge

Used on content delivered through Amazon CloudFront, this allows Lambda functions to be executed in Amazon Web Services (AWS) edge locations closer to the viewer.

Cloudflare Workers

Service Workers run on Cloudflare’s edge instead of within a browser.
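
As a minimal sketch (assuming the standard Workers fetch-event API and the @cloudflare/workers-types definitions), the same handler shape a browser Service Worker uses runs on Cloudflare’s edge instead:

```typescript
// Minimal Cloudflare Worker sketch: a Service Worker-style fetch handler,
// executed on Cloudflare's edge rather than in the user's browser.
addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  // Respond directly from the edge; no round trip to an origin server is required.
  const url = new URL(request.url);
  return new Response(`Served from the edge for ${url.pathname}`);
}
```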

IoT and Mobile Devices

Computation-heavy tasks like vision recognition or language processing are performed largely on-device, with more difficult tasks sent to a central hub for more in-depth processing.

These applications put data where it’s needed most: in the hands of people going about their daily lives. However, they all share a common weakness: distance from the central data.

The Data Dilemma

When computing on the edge, where is the data?

Edge applications don’t operate in a vacuum. They need to access data from the edge as well.

That means data has to be replicated globally since an edge application that accesses the data from the origin still creates high network latency.

There are some popular distributed database solutions, including:

  • DynamoDB Global Tables
  • CockroachDB
  • MongoDB
  • Custom replication mechanisms

They have issues, though. Distributed databases come at extremely high cost, typically 2 to 5 times the cost of a single-region database.

Storage is expensive as well, and they cover a relatively small portion of the planet (around five regions at best).

Imagining a Better Solution

The “data dilemma” has an innovative, enterprise-forward solution already under development. It’s built on two core concepts:

  • Mature, economical CDNs (Content Delivery Networks) are already in place. CloudFront has 139 edge locations, and Cloudflare has 160 at last count. CDNs bill for bandwidth usage or per GB transferred.
  • There is plenty of cheap storage available. AWS S3 has the most inexpensive real-time-access storage, and there are other options if that’s not a good fit for a specific project.

Why not leverage CDNs and cloud storage solutions to host a simple document DB?
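
Here’s a rough sketch of that idea. The bucket name, CDN URL, and document paths are hypothetical, and a fetch-capable runtime plus the AWS SDK v2 are assumed: documents live as plain JSON objects in cheap object storage, writes go to the origin bucket, and reads come back through the CDN’s edge cache.

```typescript
import AWS from 'aws-sdk'; // assumes AWS SDK v2 with S3 credentials already configured

// Hypothetical names for illustration only.
const BUCKET = 'my-neardb-bucket';
const CDN_URL = 'https://cdn.example.com';
const s3 = new AWS.S3();

// Write: store the document as a plain JSON object in the origin bucket.
async function setDoc(path: string, data: object): Promise<void> {
  await s3
    .putObject({
      Bucket: BUCKET,
      Key: path,
      Body: JSON.stringify(data),
      ContentType: 'application/json',
    })
    .promise();
}

// Read: fetch the same object through the CDN, so repeat reads are served
// from an edge cache near the user instead of from the origin bucket.
async function getDoc(path: string): Promise<object> {
  const response = await fetch(`${CDN_URL}/${path}`);
  return response.json();
}

// Usage: write once at the origin, read many times from the edge.
// await setDoc('users/user123.json', { name: 'Ada', plan: 'pro' });
// const user = await getDoc('users/user123.json');
```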

This is the idea behind “NearDB”, a simple document database made for infinitely scalable globally distributed reads.

The stack is simple and effective:

  • TypeScript
  • JavaScript-based, like most edge application platforms
  • Supports any storage that has S3 compatible APIs (AWS S3, Google Cloud Storage, Digital Ocean Spaces, Minio)
  • Bring your own CDN
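
A rough usage sketch: the config fields and method names below are assumptions modeled on the Firestore-style document API NearDB describes, so check the NearDB README for the actual interface.

```typescript
import NearDB from 'neardb'; // assumption: package name and default export

// Assumed config shape: an S3-compatible bucket for storage plus an optional CDN for edge reads.
const db = NearDB.database({
  database: 'my-neardb-bucket',            // hypothetical bucket name
  cdn: { url: 'https://cdn.example.com' }, // hypothetical CDN endpoint
});

async function example(): Promise<void> {
  const userRef = db.collection('users').doc('user123');

  // Writes always go to the origin bucket.
  await userRef.set({ name: 'Ada', plan: 'pro' });

  // Reads can be served through the CDN edge nearest the caller.
  const user = await userRef.get();
  console.log(user);
}
```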

NearDB has the edge in two key areas: speed and cost. It could potentially be the most scalable globally distributed database in the world.

Costs are 10 to 20 times lower than DynamoDB Global Tables.

The tool does have some weaknesses. There’s no consolidation algorithm; writes always happen at the origin.

There are no transactions (yet). Indices and collection queries are also a concern: they can be done, but without transactions they become very risky.

Looking Forward

NearDB is still under development. Its potential is intriguing, though.

Edge computing is where technology is headed, and NearDB just might be the right tool to make it more effective.

NearDB is just one way Concepta’s team of expert developers is working to improve enterprise technology. To find out how we can leverage our skills to meet your business goals, set up a free consultation today!


Leo Farias is the CEO and Co-founder at Concepta. He received his MPS in Business of Art & Design from the Maryland Institute College of Art. With over 18 years of technology-focused experience, he plays a vital role in architecting and leading various mission-critical projects for world-renowned clients like Time Warner Music, Orlando City Soccer, Vasco da Gama and Corinthians Soccer Club.