Best Practices

This guide provides recommendations and best practices for using SpeedyNodes effectively. Following these guidelines will help you optimize your application's performance and reduce costs.

Request Optimization

Use Gzip Compression

Using gzip compression can significantly reduce data transfer size, especially for calls that return large responses. This helps you stay within rate limits while improving response times.

Example using curl:

curl -H "Accept-Encoding: gzip" \
     -H "Content-Type: application/json" \
     -X POST https://api.speedynodes.net/http/bsc-http?apikey=YOUR_API_KEY \
     --data '{"jsonrpc":"2.0","method":"eth_getBlockReceipts","params":["0x2e60d60"],"id":1}' \
     --compressed

This can reduce bandwidth usage by up to 90% for methods like eth_getBlockReceipts. See our Rate Limits guide for more compression examples.

Batch Requests

Instead of making multiple individual JSON-RPC requests, use batch requests to combine multiple calls into a single HTTP request. This reduces network overhead and improves performance.

Example of a batch request:

[
  {"jsonrpc": "2.0", "method": "eth_blockNumber", "params": [], "id": 1},
  {"jsonrpc": "2.0", "method": "eth_getBalance", "params": ["0x742d35Cc6634C0532925a3b844Bc454e4438f44e", "latest"], "id": 2},
  {"jsonrpc": "2.0", "method": "eth_gasPrice", "params": [], "id": 3}
]
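
A batch is sent as a single POST whose body is the JSON array above; the node replies with an array of response objects that you match back to your requests by id (per the JSON-RPC 2.0 spec, batch responses may come back in any order). A minimal sketch using fetch, with the same endpoint placeholder used elsewhere in this guide:

async function sendBatch(requests) {
  const response = await fetch('https://api.speedynodes.net/http/eth-http?apikey=YOUR_API_KEY', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(requests)
  });

  // Match each response object to its request by id
  const results = await response.json();
  return new Map(results.map(r => [r.id, r]));
}

// Usage
sendBatch([
  { jsonrpc: '2.0', method: 'eth_blockNumber', params: [], id: 1 },
  { jsonrpc: '2.0', method: 'eth_gasPrice', params: [], id: 2 }
]).then(responses => console.log('Latest block:', responses.get(1).result));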

Use Event Subscriptions

For applications that need real-time updates, use WebSocket connections with eth_subscribe instead of polling with HTTP requests. This reduces the number of requests and provides more immediate updates.

Example with Web3.js:

const web3 = new Web3('wss://api.speedynodes.net/ws/eth-ws?apikey=YOUR_API_KEY');

// Subscribe to new block headers (web3.js exposes the newHeads subscription as 'newBlockHeaders')
web3.eth.subscribe('newBlockHeaders', (error, result) => {
  if (!error) {
    console.log('New block:', result.number);
  }
});

// Subscribe to specific contract events
const contract = new web3.eth.Contract(ABI, CONTRACT_ADDRESS);
contract.events.Transfer({
  fromBlock: 'latest'
}, (error, event) => {
  if (!error) {
    console.log('Transfer event:', event);
  }
});

Cache Responses

Implement caching for responses that change infrequently and for historical data, which never changes once finalized. This reduces the number of requests to our nodes and improves your application's response time.

Example caching pattern:

const cache = new Map();
const CACHE_TTL = 60000; // 1 minute in milliseconds

async function getBlockWithCache(blockNumber) {
  const cacheKey = `block:${blockNumber}`;
  const now = Date.now();

  // Check cache
  if (cache.has(cacheKey)) {
    const cached = cache.get(cacheKey);
    if (now - cached.timestamp < CACHE_TTL) {
      return cached.data;
    }
  }

  // If not in cache or expired, fetch from RPC
  const block = await web3.eth.getBlock(blockNumber);

  // Store in cache
  cache.set(cacheKey, {
    timestamp: now,
    data: block
  });

  return block;
}

Use Archive Nodes Wisely

Archive nodes provide access to historical state data, but archive queries are more resource-intensive and subject to tighter rate limits. Only use archive nodes when you need historical data or trace functionality. For standard interactions with the blockchain, use full nodes.
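
For example, reading an account balance at a block far in the past requires archive state, because full nodes only keep recent state. A minimal sketch (it reuses the sample address from the batch example and the standard endpoint placeholder; substitute your archive endpoint if your plan provides a separate one):

async function getHistoricalBalance() {
  const request = {
    jsonrpc: '2.0',
    method: 'eth_getBalance',
    // Address and historical block number; state this old is typically only available on archive nodes
    params: ['0x742d35Cc6634C0532925a3b844Bc454e4438f44e', '0x100000'],
    id: 1
  };

  const response = await fetch('https://api.speedynodes.net/http/eth-http?apikey=YOUR_API_KEY', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(request)
  });

  const { result } = await response.json();
  console.log('Balance at block 0x100000:', result);
}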

Connection Management

Implement Retry Logic

Network connections can occasionally fail. Implement retry logic with exponential backoff to handle temporary disruptions.

Example retry function:

async function fetchWithRetry(jsonRpcRequest, maxRetries = 3, initialDelay = 1000) {
  let retries = 0;
  let delay = initialDelay;

  while (retries < maxRetries) {
    try {
      const response = await fetch('https://api.speedynodes.net/http/eth-http?apikey=YOUR_API_KEY', {
        method: 'POST',
        headers: { 
          'Content-Type': 'application/json',
          'Accept-Encoding': 'gzip'
        },
        body: JSON.stringify(jsonRpcRequest)
      });

      if (response.ok) {
        return await response.json();
      }
    } catch (error) {
      console.log(`Attempt ${retries + 1} failed: ${error.message}`);
    }

    // Exponential backoff
    await new Promise(resolve => setTimeout(resolve, delay));
    delay *= 2;
    retries++;
  }

  throw new Error('Max retries exceeded');
}

Use Connection Pooling

For applications with high throughput, implement connection pooling to reuse connections rather than creating new ones for each request.
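
In Node.js this can be as simple as sharing a keep-alive Agent, so TCP and TLS handshakes are reused across requests instead of repeated for every call. A minimal sketch (the socket limits are illustrative, not tuned recommendations):

const https = require('https');

// A shared agent keeps idle sockets open and reuses them for subsequent requests
const agent = new https.Agent({
  keepAlive: true,
  maxSockets: 50,      // illustrative upper bound on concurrent sockets
  maxFreeSockets: 10   // illustrative number of idle sockets kept open for reuse
});

function rpcRequest(body) {
  return new Promise((resolve, reject) => {
    const req = https.request(
      'https://api.speedynodes.net/http/eth-http?apikey=YOUR_API_KEY',
      { method: 'POST', agent, headers: { 'Content-Type': 'application/json' } },
      res => {
        let data = '';
        res.on('data', chunk => (data += chunk));
        res.on('end', () => resolve(JSON.parse(data)));
      }
    );
    req.on('error', reject);
    req.end(JSON.stringify(body));
  });
}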

Implement Client-Side Rate Limiting

Implement rate limiting on your side to prevent exceeding the allowed requests per second.

Example rate limiter:

class RateLimiter {
  constructor(maxRequestsPerSecond) {
    this.maxRequestsPerSecond = maxRequestsPerSecond;
    this.requestTimestamps = [];
  }

  async throttle() {
    const now = Date.now();

    // Remove timestamps older than 1 second
    this.requestTimestamps = this.requestTimestamps.filter(
      timestamp => now - timestamp < 1000
    );

    if (this.requestTimestamps.length >= this.maxRequestsPerSecond) {
      // Wait until we can make another request
      const oldestTimestamp = this.requestTimestamps[0];
      const timeToWait = 1000 - (now - oldestTimestamp);
      if (timeToWait > 0) {
        await new Promise(resolve => setTimeout(resolve, timeToWait));
      }
    }

    // Add current timestamp to the list
    this.requestTimestamps.push(Date.now());
  }

  async executeRequest(requestFn) {
    await this.throttle();
    return requestFn();
  }
}

// Usage
const limiter = new RateLimiter(500); // 500 requests per second
limiter.executeRequest(() => web3.eth.getBlockNumber());

Security Best Practices

Secure Your API Key

  • Never hardcode your API key in client-side code
  • Store your API key in environment variables or secure vaults (see the sketch after this list)
  • Restrict API key usage to specific domains if possible
  • Regenerate your API key if you suspect it has been compromised
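
For server-side code, a minimal sketch of loading the key from an environment variable (the variable name is illustrative):

// Set SPEEDYNODES_API_KEY in your deployment environment, not in source control
const API_KEY = process.env.SPEEDYNODES_API_KEY;
if (!API_KEY) {
  throw new Error('SPEEDYNODES_API_KEY is not set');
}

const ENDPOINT = `https://api.speedynodes.net/http/eth-http?apikey=${API_KEY}`;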

Use HTTPS for All Requests

Always use HTTPS endpoints to ensure encrypted communication with our nodes.

Implement Rate Limiting for Your APIs

If you're building a service on top of SpeedyNodes, implement rate limiting for your users to prevent abuse and ensure fair usage.
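
One way to do this is to keep a limiter per consumer of your own API, reusing the RateLimiter class from the example above (the per-user limit shown is illustrative):

// One RateLimiter instance per user of your service
const userLimiters = new Map();

function limiterFor(userId, maxRequestsPerSecond = 10) { // illustrative per-user limit
  if (!userLimiters.has(userId)) {
    userLimiters.set(userId, new RateLimiter(maxRequestsPerSecond));
  }
  return userLimiters.get(userId);
}

// Throttle each user's traffic before forwarding it to SpeedyNodes
async function handleUserRequest(userId, requestFn) {
  return limiterFor(userId).executeRequest(requestFn);
}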

Debugging and Optimization

Log Request Metrics Locally

Track metrics for your RPC requests to identify performance issues (a minimal logging sketch follows this list):

  • Request latency
  • Success/error rates
  • Request volume by method
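
A minimal sketch of a wrapper that records latency and outcomes per method; how you persist or export the metrics is up to you:

// In-memory counters per JSON-RPC method
const metrics = new Map();

async function instrumentedCall(method, requestFn) {
  const start = Date.now();
  const entry = metrics.get(method) || { calls: 0, errors: 0, totalLatencyMs: 0 };
  try {
    return await requestFn();
  } catch (error) {
    entry.errors++;
    throw error;
  } finally {
    entry.calls++;
    entry.totalLatencyMs += Date.now() - start;
    metrics.set(method, entry);
  }
}

// Usage
instrumentedCall('eth_blockNumber', () => web3.eth.getBlockNumber());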

Use Multiple Chains for Load Distribution

If your application supports multiple blockchain networks, distribute load across them to optimize throughput and reduce per-chain request rates.
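
In practice this means giving each chain its own endpoint (and, if needed, its own client-side rate limiter) so heavy traffic on one chain does not eat into another chain's budget. A minimal sketch using the two endpoints shown in this guide:

// One endpoint per chain; add the chains your application actually uses
const chainEndpoints = {
  eth: 'https://api.speedynodes.net/http/eth-http?apikey=YOUR_API_KEY',
  bsc: 'https://api.speedynodes.net/http/bsc-http?apikey=YOUR_API_KEY'
};

async function rpcCall(chain, request) {
  const response = await fetch(chainEndpoints[chain], {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(request)
  });
  return response.json();
}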

Test with the Free Trial

Before deploying to production, use our free trial to test your application's integration with SpeedyNodes and identify any issues.

Plan Selection Guidelines

When to Choose Higher Tiers

  • Tier 1: Ideal for individual developers or small projects focused on a single blockchain
  • Tier 2: Best for projects working with multiple blockchains simultaneously
  • Tier 3: Perfect for applications requiring access to all supported blockchains
  • Private Node: Recommended for:
      • Projects with very high RPS requirements
      • Applications where consistent low latency is critical
      • Projects requiring custom configurations

When to Use Archive Nodes

You should use archive nodes when:

  • You need to query historical state (e.g., account balances at specific past blocks)
  • You require trace functionality (trace_* methods)
  • You're analyzing historical transaction data
  • You're building blockchain explorers or analytics tools

Support and Troubleshooting

Common Issues

If you encounter issues with your connection:

  1. Verify your API key is correct
  2. Check that you're using the correct endpoint URL
  3. Ensure you're not exceeding your plan's rate limits
  4. Verify your JSON-RPC request format is correct

Getting Help

If you need assistance: