
Design a Real-Time ETA Prediction System Using Kafka, DynamoDB, and Rockset

These are strange times. Cities are in lockdown, and few people are venturing outside, so the increased use of on-demand logistics services, such as online food delivery, comes as no surprise.
Most of these applications provide near real-time tracking of the ETA once you place an order. Building a scalable, distributed, real-time ETA prediction system is a tough task, but what if we could simplify its design? We'll break the system into pieces so that each component is responsible for one primary job.
Let's take a look at the components that constitute the system.
To get an accurate ETA estimate, you need the delivery person's position, specifically the latitude and longitude. You can obtain this easily via the GPS on the device. A call to the device's GPS provider returns the latitude, the longitude, and the accuracy of the fix in meters.
You can run a background service in the app that retrieves the GPS coordinates every 10 seconds. The raw coordinates are too fine-grained to feed into a prediction, so we coarsen them using geohashes. A geohash is a standardized N-letter hash of a location that represents an area of M sq. miles. N and M are inversely proportional, so a larger N represents a smaller area M. You can refer to a geohash reference for more info.
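To make the 10-second polling loop concrete, here is a minimal sketch. The GpsProvider interface and GpsFix record are hypothetical stand-ins for the platform's actual location API (on Android, for example, this would be the platform location services):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class LocationPoller {

    // Hypothetical wrapper around the device's GPS provider; the real
    // call depends on the platform's location API.
    interface GpsProvider {
        GpsFix currentFix();
    }

    // Latitude, longitude, and the accuracy of the fix in meters.
    record GpsFix(double latitude, double longitude, double accuracyMeters) {}

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    public void start(GpsProvider gps) {
        // Poll the GPS every 10 seconds, as described above.
        scheduler.scheduleAtFixedRate(() -> {
            GpsFix fix = gps.currentFix();
            publish(fix); // hand off to the geohashing + Kafka step
        }, 0, 10, TimeUnit.SECONDS);
    }

    private void publish(GpsFix fix) {
        // See the Kafka producer sketch below.
    }
}
```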
There are tons of libraries available to convert a latitude-longitude pair to a geohash. Here we'll use geo by davidmoten to get a 6-7 letter geohash.
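With that library on the classpath (artifact com.github.davidmoten:geo), encoding a fix looks roughly like this; the coordinates below are purely illustrative:

```java
import com.github.davidmoten.geo.GeoHash;

public class GeohashExample {
    public static void main(String[] args) {
        // Example coordinates; any latitude/longitude pair works.
        double lat = 12.9716;
        double lon = 77.5946;

        // A 6-character geohash covers a cell of roughly 1.2 km x 0.6 km,
        // coarse enough to group nearby delivery positions together.
        String geohash = GeoHash.encodeHash(lat, lon, 6);
        System.out.println(geohash); // prints the 6-letter cell identifier
    }
}
```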
The service then pushes the geohash along with the coordinates to a Kafka topic. Rockset ingests the data from this Kafka topic into a collection called locations.
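A producer for this step could look like the following sketch; the topic name, the JSON message shape, and keying by driver ID are assumptions for illustration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LocationPublisher {

    // Topic name is an assumption; Rockset would be configured to
    // ingest from whichever topic you choose.
    private static final String TOPIC = "driver-locations";

    private final KafkaProducer<String, String> producer;

    public LocationPublisher(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        this.producer = new KafkaProducer<>(props);
    }

    public void publish(String driverId, double lat, double lon, String geohash) {
        // JSON payload carrying the raw coordinates plus the geohash;
        // keying by driver keeps one driver's updates ordered per partition.
        String value = String.format(
                "{\"driverId\":\"%s\",\"lat\":%f,\"lon\":%f,\"geohash\":\"%s\",\"ts\":%d}",
                driverId, lat, lon, geohash, System.currentTimeMillis());
        producer.send(new ProducerRecord<>(TOPIC, driverId, value));
    }
}
```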
Orders placed by customers are stored in DynamoDB for further processing. An order generally goes through a life cycle of states, from placement to delivery. Each of these state changes is recorded in DynamoDB along with additional data such as the source location, destination location, and order details.
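Recording a state transition with the AWS SDK for Java v2 might look like this sketch; the table name, key, and attribute names are assumptions for illustration:

```java
import java.util.Map;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.UpdateItemRequest;

public class OrderStore {

    // Illustrative table name; adjust to your schema.
    private static final String TABLE = "orders";

    private final DynamoDbClient dynamo = DynamoDbClient.create();

    public void updateState(String orderId, String newState) {
        // Record each state transition with a timestamp so the order's
        // history can be reconstructed downstream.
        dynamo.updateItem(UpdateItemRequest.builder()
                .tableName(TABLE)
                .key(Map.of("orderId", AttributeValue.builder().s(orderId).build()))
                .updateExpression("SET orderState = :s, updatedAt = :t")
                .expressionAttributeValues(Map.of(
                        ":s", AttributeValue.builder().s(newState).build(),
                        ":t", AttributeValue.builder().n(
                                Long.toString(System.currentTimeMillis())).build()))
                .build());
    }
}
```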
