Bringing the Power of Context API On-Premises

We’re excited to introduce our latest open-source innovation: the ability to deploy our Context API on-premises. This new project, now available on GitHub, lets you integrate the full capabilities of our Context API directly into your local environment.

Explore our GitHub Project

Background

Initially conceived as a demonstration tool for customers, this project showcased how to utilize our feeds on-premises. It has evolved significantly, now doubling as a method for running a local, Context API-like service. This transformation reflects our commitment to providing versatile, secure, and efficient data solutions tailored to our clients’ evolving needs.

Why Run Context API On-Premises?

The primary benefits of deploying this on-premises solution are privacy and control. You can provide your internal customers with the same simplicity and efficiency as our Context API, but with all data requests processed locally. This means enhanced security, since sensitive data never leaves your network, and reduced latency, since requests are processed internally.

Seamless Integration and Localized Requests

Built on Redis, a high-performance key-value store, this open-source project integrates cleanly into your existing systems. The setup supports continuous feed ingestion and real-time data merging while maintaining the high performance and reliability you’ve come to expect from our services.
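
Because the data lives in a standard Redis instance, you can sanity-check ingestion with stock Redis tooling. As a rough, hypothetical check (it assumes redis-cli is installed on the host and Redis is published on port 6379, as in the Compose file in the next section), a steadily growing key count indicates that feed data is being written:

# Count the keys in the local Redis instance; the number should grow as the feed is ingested.
redis-cli -p 6379 DBSIZE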

How do I use it?

To get started quickly, use the following Docker Compose file, which starts a Redis server and the feed-example-redis app.
NOTE: You will need to set the SPUR_REDIS_API_TOKEN environment variable to your Spur token. See the configuration section of the GitHub README for more information on the other environment variables.

version: '3'

services:
  redis:
    image: redis
    ports:
      - "6379:6379"
    healthcheck:
      test: [ "CMD", "redis-cli", "--raw", "incr", "ping" ]
      interval: 30s
      timeout: 10s
      retries: 5

  spurredis:
    image: spurintelligence/spurredis:latest
    depends_on:
      redis:
        condition: service_healthy
    ports:
      - "8080:8080"
    environment:
      SPUR_REDIS_API_TOKEN: ${SPUR_REDIS_API_TOKEN}
      SPUR_REDIS_FEED_TYPE: anonymous
      SPUR_REDIS_CHUNK_SIZE: 5000
      SPUR_REDIS_CONCURRENT_NUM: 4
      SPUR_REDIS_ADDR: redis:6379
      SPUR_REDIS_LOCAL_API_AUTH_TOKENS: testtoken1,testtoken2
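
Assuming the file above is saved as docker-compose.yml, startup looks roughly like this (the token value below is a placeholder for your own Spur token):

# Provide your Spur token, start both services in the background, and confirm they are running.
export SPUR_REDIS_API_TOKEN="your-spur-token"
docker compose up -d
docker compose ps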

Then run the following command to test the API:

curl -vv -H "TOKEN: testtoken1" localhost:8080/v2/context/${YOUR_IP} | jq
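
The same endpoint works from scripts. As a small, hypothetical example, the following loop enriches a file of IP addresses (ips.txt, one address per line) against the local API:

# Query the local Context API for each IP and print one compact JSON document per line.
while read -r ip; do
  curl -s -H "TOKEN: testtoken1" "localhost:8080/v2/context/${ip}" | jq -c .
done < ips.txt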

System Requirements

To ensure optimal performance and stability of the on-premises Context API, the following system requirements must be met; they depend on the feeds you plan to ingest:

Anonymous Feeds: A minimum of 5 GB of memory is required to handle anonymous feeds efficiently.

Anonymous Residential Feeds: For anonymous residential feeds, your system should have at least 18 GB of memory.

Please note that these are the minimum requirements and may need to be adjusted based on the volume of requests your deployment is expected to handle.
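
Because the feed data is held in Redis, one rough way to confirm that your host has enough headroom is to check Redis memory usage once a full feed has been ingested (this assumes the Compose setup from the quick-start section):

# Report Redis memory consumption; used_memory_human should sit comfortably below the host's available memory.
docker compose exec redis redis-cli INFO memory | grep used_memory_human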

Open Source and Community Supported

This is an open-source project, and we invite the community to contribute, enhancing the tool’s capabilities and ensuring it meets a wide range of needs. Whether you’re a small business looking to protect your privacy or a large enterprise in need of a scalable IP enrichment solution, our on-premises Context API is designed to adapt to your requirements.

Explore the possibilities today by visiting our GitHub repository, and empower your internal customers with the advanced data insights they need, securely and efficiently.
