Tutorial

React i18n with a Translation API: Automating Translations in React Apps

How to connect a translation API to your React i18n workflow. Covers react-i18next setup, automated string extraction, API integration for generating translations, and a CI/CD pipeline that keeps translations in sync.

Thomas van Leer · Content Manager, Langbly · February 18, 2026 · 11 min read

Most React i18n tutorials stop at setting up react-i18next and creating JSON translation files manually. That works for 2-3 languages with a small team. It falls apart when you're supporting 10+ languages and shipping features weekly. Manually translating JSON files is slow, error-prone, and creates a bottleneck that delays every release.

This guide goes further: how to connect a translation API to your React i18n workflow so translations are generated automatically, stay in sync with your source strings, and don't slow down your development cycle.

The setup: react-i18next basics

If you're already using react-i18next, skip to the automation section. For everyone else, here's the minimal setup.

Install the dependencies:

npm install react-i18next i18next i18next-http-backend i18next-browser-languagedetector

Create your i18n configuration:

// src/i18n.ts
import i18n from 'i18next';
import { initReactI18next } from 'react-i18next';
import Backend from 'i18next-http-backend';
import LanguageDetector from 'i18next-browser-languagedetector';

i18n
  .use(Backend)
  .use(LanguageDetector)
  .use(initReactI18next)
  .init({
    fallbackLng: 'en',
    supportedLngs: ['en', 'de', 'fr', 'es', 'ja', 'nl'],
    interpolation: {
      escapeValue: false, // React already escapes
    },
    backend: {
      loadPath: '/locales/{{lng}}/{{ns}}.json',
    },
  });

export default i18n;

Your source strings go in public/locales/en/translation.json:

{
  "nav": {
    "home": "Home",
    "settings": "Settings",
    "logout": "Log out"
  },
  "dashboard": {
    "welcome": "Welcome back, {{name}}",
    "projects": "You have {{count}} active project",
    "projects_plural": "You have {{count}} active projects"
  }
}

Use translations in components:

import { useTranslation } from 'react-i18next';

function Dashboard({ user }) {
  const { t } = useTranslation();

  return (
    <div>
      <h1>{t('dashboard.welcome', { name: user.name })}</h1>
      <p>{t('dashboard.projects', { count: user.projectCount })}</p>
    </div>
  );
}

This is the standard approach. The problem starts when you need public/locales/de/translation.json, public/locales/fr/translation.json, and so on. Somebody has to create and maintain those files.

Automating translations with an API

Instead of manually translating each JSON file, write a script that reads your English source file, sends untranslated strings to a translation API, and writes the results to the target language files.

The translation script

// scripts/translate.ts
import fs from 'fs';
import path from 'path';

const API_URL = 'https://api.langbly.com/language/translate/v2';
const API_KEY = process.env.TRANSLATION_API_KEY;
const LOCALES_DIR = path.join(__dirname, '../public/locales');
const SOURCE_LANG = 'en';
const TARGET_LANGS = ['de', 'fr', 'es', 'ja', 'nl'];

async function translate(text: string, target: string): Promise<string> {
  const res = await fetch(API_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      q: text,
      source: SOURCE_LANG,
      target,
      key: API_KEY,
    }),
  });

  if (!res.ok) {
    throw new Error('Translation API returned ' + res.status + ': ' + (await res.text()));
  }

  const data = await res.json();
  return data.data.translations[0].translatedText;
}

function flattenJson(obj: any, prefix = ''): Record<string, string> {
  const result: Record<string, string> = {};
  for (const [key, value] of Object.entries(obj)) {
    const fullKey = prefix ? prefix + '.' + key : key;
    if (typeof value === 'object' && value !== null) {
      Object.assign(result, flattenJson(value, fullKey));
    } else {
      result[fullKey] = String(value);
    }
  }
  return result;
}

function unflattenJson(flat: Record<string, string>): any {
  const result: any = {};
  for (const [key, value] of Object.entries(flat)) {
    const parts = key.split('.');
    let current = result;
    for (let i = 0; i < parts.length - 1; i++) {
      if (!current[parts[i]]) current[parts[i]] = {};
      current = current[parts[i]];
    }
    current[parts[parts.length - 1]] = value;
  }
  return result;
}

async function translateFile(targetLang: string) {
  const sourcePath = path.join(LOCALES_DIR, SOURCE_LANG, 'translation.json');
  const targetPath = path.join(LOCALES_DIR, targetLang, 'translation.json');

  const source = JSON.parse(fs.readFileSync(sourcePath, 'utf-8'));
  const existing = fs.existsSync(targetPath)
    ? JSON.parse(fs.readFileSync(targetPath, 'utf-8'))
    : {};

  const sourceFlat = flattenJson(source);
  const existingFlat = flattenJson(existing);

  // Only translate strings that are missing from the target file
  const toTranslate: Record<string, string> = {};
  for (const [key, value] of Object.entries(sourceFlat)) {
    if (!existingFlat[key]) {
      toTranslate[key] = value;
    }
  }

  console.log('[' + targetLang + '] ' + Object.keys(toTranslate).length + ' new strings to translate');

  const translated = { ...existingFlat };
  for (const [key, value] of Object.entries(toTranslate)) {
    translated[key] = await translate(value, targetLang);
  }

  // Remove keys that no longer exist in source
  for (const key of Object.keys(translated)) {
    if (!sourceFlat[key]) delete translated[key];
  }

  const targetDir = path.dirname(targetPath);
  if (!fs.existsSync(targetDir)) fs.mkdirSync(targetDir, { recursive: true });

  fs.writeFileSync(
    targetPath,
    JSON.stringify(unflattenJson(translated), null, 2) + '\n'
  );
}

async function main() {
  for (const lang of TARGET_LANGS) {
    await translateFile(lang);
  }
  console.log('Done!');
}

main();

This script does several smart things:

  • Incremental translation: Only translates strings that don't exist in the target file. Existing translations are preserved.
  • Cleanup: Removes translations for keys that no longer exist in the source file.
  • Structure preservation: Flattens nested JSON for translation, then unflattens the result back to the original structure.
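One thing the script does not do is handle failures: a transient network or rate-limit error would abort the whole run mid-file. A minimal sketch of a generic retry helper with exponential backoff (the `withRetry` name and defaults are my own, not part of any API):

```typescript
// Hypothetical retry wrapper: retries a failing async call with
// exponential backoff before giving up.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Backoff: 500ms, 1s, 2s, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}

// Usage inside translateFile would look like:
//   translated[key] = await withRetry(() => translate(value, targetLang));
```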

Add it to your package.json:

"scripts": {
  "translate": "tsx scripts/translate.ts"
}

Run it: TRANSLATION_API_KEY=your_key npm run translate. See the quickstart guide for getting your API key.

Handling plurals and interpolation

The tricky part of automated translation is preserving i18next features like interpolation and pluralization.

Interpolation variables

Strings like Welcome back, {{name}} contain interpolation variables that must survive translation. A good translation API preserves {{name}} as-is. Langbly and Google Translate both handle this correctly because double curly braces aren't natural language.

If your API mangles interpolation variables, add a pre-processing step that replaces them with placeholder tokens before translation and restores them after:

function protectVariables(text: string): [string, Map<string, string>] {
  const map = new Map<string, string>();
  let i = 0;
  // Note: "protected" is a reserved word, so use a different variable name
  const safeText = text.replace(/\{\{\w+\}\}/g, (match) => {
    const token = '__VAR' + (i++) + '__';
    map.set(token, match);
    return token;
  });
  return [safeText, map];
}

function restoreVariables(text: string, map: Map<string, string>): string {
  let result = text;
  for (const [token, original] of map) {
    result = result.replace(token, original);
  }
  return result;
}
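To make the round trip concrete, here's a self-contained sketch composing the two helpers around an arbitrary translate function. `safeTranslate` and `mockTranslate` are illustrative names, and the helpers are reproduced inline so the snippet runs on its own:

```typescript
// Replace {{variables}} with stable tokens before translation.
function protectVariables(text: string): [string, Map<string, string>] {
  const map = new Map<string, string>();
  let i = 0;
  const safeText = text.replace(/\{\{\w+\}\}/g, (match) => {
    const token = '__VAR' + (i++) + '__';
    map.set(token, match);
    return token;
  });
  return [safeText, map];
}

// Swap the tokens back after translation.
function restoreVariables(text: string, map: Map<string, string>): string {
  let result = text;
  for (const [token, original] of map) {
    result = result.replace(token, original);
  }
  return result;
}

// Wrap any translator so interpolation variables survive the round trip.
async function safeTranslate(
  text: string,
  translateFn: (s: string) => Promise<string>
): Promise<string> {
  const [safeText, map] = protectVariables(text);
  const translated = await translateFn(safeText);
  return restoreVariables(translated, map);
}

// mockTranslate stands in for the real API call.
const mockTranslate = async (s: string) => s.toUpperCase();

safeTranslate('Welcome back, {{name}}', mockTranslate)
  .then(console.log); // "WELCOME BACK, {{name}}"
```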

Plural forms

i18next handles pluralization with suffixed keys. The legacy JSON format shown above uses key and key_plural (English), or key_0, key_1, key_2 for languages with more plural forms; i18next v21+ switched to CLDR-based category suffixes such as key_one and key_other.

The challenge: different languages have different plural rules. English has 2 forms (singular, plural). French has 2. Arabic has 6. Japanese has 1.

Your translation script needs to know the plural rules for each target language and generate the right number of plural variants. The CLDR plural rules define this. For a basic implementation:

const PLURAL_FORMS: Record<string, string[]> = {
  en: ['', '_plural'],
  de: ['', '_plural'],
  fr: ['', '_plural'],
  ja: [''],  // Japanese has no plural distinction
  ar: ['_0', '_1', '_2', '_3', '_4', '_5'],
};

When you encounter a plural key in the source, generate translations for each plural form the target language needs. This adds complexity but it's necessary for correct localization.
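Rather than maintaining that suffix table by hand, the plural categories for any language can be derived at runtime from Intl.PluralRules, which ships with the CLDR data in Node and modern browsers (these are the same category names newer i18next versions use as key suffixes):

```typescript
// Ask the runtime's CLDR data which plural categories a language uses.
function pluralCategories(lang: string): string[] {
  return new Intl.PluralRules(lang).resolvedOptions().pluralCategories;
}

console.log(pluralCategories('en')); // ['one', 'other']
console.log(pluralCategories('ja')); // ['other']
console.log(pluralCategories('ar').length); // 6
```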

CI/CD integration

The real value of automated translation comes when you integrate it into your development pipeline. Translations should update automatically when source strings change.

GitHub Actions example

# .github/workflows/translate.yml
name: Update translations
on:
  push:
    paths:
      - 'public/locales/en/**'
    branches: [main]

jobs:
  translate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run translate
        env:
          TRANSLATION_API_KEY: ${{ secrets.TRANSLATION_API_KEY }}
      - uses: stefanzweifel/git-auto-commit-action@v5
        with:
          commit_message: 'chore: update translations'
          file_pattern: 'public/locales/**/*.json'

This workflow triggers whenever the English source strings change. It runs the translation script and commits the updated translation files back to the repo. Translators or reviewers can then check the changes in a PR.

Alternative: TMS integration

For larger projects, consider using a translation management system like Crowdin or Lokalise instead of a direct API script. TMS platforms provide:

  • A web interface for translators to review and edit machine translations
  • Translation memory to avoid re-translating identical strings
  • Git integration that automatically syncs translation files
  • Quality assurance checks

The TMS approach adds a review step between machine translation and deployment, which improves quality at the cost of speed. For user-facing applications, this trade-off is usually worth it. For internal tools, the direct API approach might be enough.

Performance considerations

Loading translations efficiently

With the i18next-http-backend, translations load from JSON files at runtime. This means a network request for each language namespace. For apps supporting many languages, lazy-load translations:

// Only load the detected/selected language, not all languages
i18n.init({
  // ...
  load: 'currentOnly', // Don't load fallback language eagerly
  backend: {
    loadPath: '/locales/{{lng}}/{{ns}}.json',
  },
});

Bundle size

For small apps, you might want to bundle translations into the JavaScript bundle instead of loading them via HTTP. Use i18next with resources instead of the HTTP backend:

import en from '../public/locales/en/translation.json';
import de from '../public/locales/de/translation.json';

i18n.init({
  resources: {
    en: { translation: en },
    de: { translation: de },
  },
  // No backend plugin needed
});

This eliminates the loading delay but increases your bundle size. For 5 languages with 500 strings each, the JSON adds roughly 50-100KB uncompressed (10-20KB gzipped). Reasonable for most apps.

Costs at scale

How much does automated React i18n translation cost with an API?

A typical React app has 200-2,000 translatable strings. At an average of 30 characters per string:

  • 500 strings, 5 languages: 75,000 characters = well within any free tier
  • 2,000 strings, 10 languages: 600,000 characters = slightly over Langbly's 500K free tier for the initial translation, with minimal overage on updates
  • 5,000 strings, 15 languages: 2.25M characters = $8.55 with Langbly Starter ($3.80/M), $45 with Google Translate ($20/M)

After the initial translation, incremental costs are much lower because you're only translating new or changed strings. A typical weekly release might add 20-50 new strings, which is pennies with any API.
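The arithmetic above can be sketched as a small helper (the function name is hypothetical; the prices are the per-million-character rates quoted above):

```typescript
// Back-of-envelope translation cost: total characters / 1M × price per 1M.
function estimateCost(
  strings: number,
  avgChars: number,
  languages: number,
  pricePerMillionChars: number
): number {
  const totalChars = strings * avgChars * languages;
  return (totalChars / 1_000_000) * pricePerMillionChars;
}

// 5,000 strings × 30 chars × 15 languages = 2.25M characters
console.log(estimateCost(5000, 30, 15, 3.8).toFixed(2)); // "8.55"
console.log(estimateCost(5000, 30, 15, 20).toFixed(2)); // "45.00"
```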

For a detailed cost comparison across APIs, see the cheapest translation API guide.

Quality review workflow

Automated translation gets you 85-95% of the way there. The remaining 5-15% needs human review, especially for user-facing content.

A practical review workflow for React apps:

  1. Developer adds new strings to the English source file
  2. CI pipeline generates translations via API
  3. Translations are committed to a branch and a PR is opened
  4. A native speaker reviews the changed strings in the PR diff
  5. Corrections are made directly in the JSON files
  6. PR is merged, translations ship with the next release

The quality management guide covers how to structure this review process at scale, including which strings to prioritize for review and how to measure translation quality over time.


Tags: React · i18n · react-i18next · Translation API · Localization

Translate your React app for $1.99/M characters

Langbly is Google Translate v2 compatible. Drop it into your React i18n workflow with a single API call.