Translating a Video into Multiple Languages with AI

Overview

The AI translation feature lets you turn a single video into a multilingual viewing experience. Viewers watching the video in the SundaySky player can choose their preferred language from the available translations, making it easy to reach a broader audience, even when you don't know a viewer's language in advance. If you don't use the SundaySky player, the feature still enables you to quickly generate translated versions from a single source video.

Another key benefit is simplified maintenance. Instead of managing and updating a separate video for each language, you maintain a single source video. Any changes you make — such as updating content or correcting a mistake — are automatically applied across all translated languages, saving time and reducing the risk of inconsistencies.

You can translate a video from its source language into up to three additional languages, so a single video can include up to four languages in total.

The following elements are translated for the video:

  • Narration
  • Closed captions
  • Video transcript

Limitations

The AI translation feature has the following limitations:

1. Videos that include an avatar cannot be translated using this feature. To translate the video, you must first remove the avatar.
 
The following message is displayed:

Avatar_message.png

2. Templates do not support multiple languages. If you translate a video into multiple languages and then save it as a template using Save as template, you'll see a message indicating that only the source language will be kept. All additional languages will be removed from the template.
 
The following message is displayed:

Template_message.png


AI Usage Credits for Narration Translation

Using the AI translation feature requires AI usage credits. Your available credit balance is displayed at the top of the Languages panel.

Credit_Balance_80.png

You can review, edit, and preview narration translations without consuming any credits. Credits are used only when you approve the video.

How credits are calculated:
1. Credits are calculated from the full duration of each scene, not just the narrated portion. If a scene contains 4 seconds of narration but the scene itself is 5 seconds long, all 5 seconds count toward credit usage.
2. Each second counted this way consumes one AI usage credit.
3. The source language does not consume credits; only the additional translated languages do.

Example:

Assume your video's source language is English and the video includes six scenes, each five seconds long (30 seconds total). The video is translated into French and Italian.

  • French: 6 scenes x 5 seconds = 30 seconds
  • Italian: 6 scenes x 5 seconds = 30 seconds

In total, approving these translations will consume approximately 60 AI usage credits. The actual number may vary slightly, because translated narration can make individual scenes slightly shorter or longer than the original.

Important notes:
1. If you remove a language after the video is approved, the generated translation content is deleted, but the credits used are not restored.
2. Credits are always calculated per scene. Even if you edit a single word in a translated scene, credits are calculated based on the full duration of that scene, not just the portion that was changed.
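
The arithmetic above can be sketched as a quick estimate. This is an illustrative model only: the estimate_credits function, its parameters, and the one-credit-per-second assumption are ours for illustration, and the exact number of credits is always shown when you approve the video.

    # Illustrative estimate only; the approval dialog shows the authoritative number.
    def estimate_credits(scene_durations_sec, num_translated_languages):
        # Credits count the full duration of each scene, not just the narrated part,
        # and only the added languages are charged (the source language is free).
        total_scene_seconds = sum(scene_durations_sec)
        return total_scene_seconds * num_translated_languages

    # Example from above: six 5-second scenes, translated into French and Italian.
    scenes_sec = [5, 5, 5, 5, 5, 5]
    print(estimate_credits(scenes_sec, num_translated_languages=2))  # -> approximately 60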

Recommended Workflow

To get the best results when translating a video with AI, we recommend following this workflow:

1. Select the video's source language.
This is the language of the original narration.
2. Finalize the source narration.
This step is especially important. Any changes you make to the source narration automatically overwrite the translated versions. For this reason, we recommend completing and stabilizing the source narration before continuing.
3. Select the target languages.
Choose the languages you want to translate the video into (a minimum of one and up to three).
4. Review and edit the translations.
Because translations are generated using AI, it's important to review all translated narration and make any necessary adjustments before approving the video.

The following sections describe these steps in detail.

Selecting the Source Language

1. Open the relevant video in the Studio.
2. Click the language selector in the top bar and then click Change narration source language.
  Language_indicator.png
3. From the drop-down list, select the video's source language.
  Language_selection.png

Selecting the Target Languages

1. Make sure the AI voice selected for the video supports multilingual narration.
▶ Voices that support this feature include stock voices marked with a globe icon, as well as any custom AI voices you've added.
2. Click Languages in the sidebar.
▶ Alternatively, use the language selector in the top bar and then click Add more languages.
3. In the Languages panel, click Add up to 3 languages.
▶ If this is your first time using this feature, you'll be prompted to provide consent by accepting the Consent Form.
4. Select the languages you want to translate into (up to three).
▶ You can click the play button next to each language to preview the voice in the selected language.
  Language_selections.png
5. Click Enable translation.
  A confirmation message is displayed, indicating that the selected languages have been applied successfully.
  Translation_confirmation.png
  After completing these steps, clicking the language selector in the top bar displays the source language (marked with a distinct icon) and the translated languages you selected. Changing the language here updates all narration placeholders in the video to the selected language.
  Top_bar_selections.png

Reviewing and Editing AI Translations

Before you begin editing translations, make sure the source narration is finalized. Any changes to the source narration automatically overwrite the translated versions.

While there are different ways to review and edit translated content, this section describes the approach we recommend for the most effective results.


1. In the top bar, use the language selector to switch to the language you want to work on.
  Note that the narration placeholders now contain translated text based on the selected language.
▶ In the example below, the selected narration language is Italian.
  Placeholder_Italian.png
2. In the scene you want to work on, double-click the narration placeholder.
  The scene's narration is displayed in the Narration dialog window.
  Narration_dialog_window.png
3. Click Show source in the dialog window.
  The translation window is displayed. On the left, you'll see the translated narration, and on the right, the original narration in the source language. This side-by-side view makes it easy to edit the translation while referencing the original text.
  Translation_window.png
4. Review the translation and edit as needed.
▶ As you edit, you can click the play button beneath the translation to preview the narration.
5. When you're finished editing, click anywhere on the page.
6. (Optional) Preview the entire scene by clicking the preview button in the Workspace.
▶ By default, the preview plays in the translation language you edited.
7. Repeat steps 2–6 for all scenes in the video.

Approving a Video with Translations

Approving a video with translations works the same way as approving a regular video; the only difference is that AI usage credits are deducted from your account at this stage.


1. Open the relevant video on its Video Page.
2. Note the Languages section, where the source language and translated languages are indicated.
  Video_page_languages.png
3. Click Approve.
4. Note your current balance of AI usage credits and the approximate number of credits that will be consumed to translate the current video.
  Approve_video.png
5. Click Approve.
  The video is approved, and a confirmation message appears at the bottom of the page, indicating exactly how many credits were used.
  Confirmation_message.png

Sharing a Video with Translations

How translations are implemented depends on how you share the video.
When you share a video using the SundaySky landing page or embed code and play it in the SundaySky player, viewers can select their preferred language directly from the control bar.
If you share the video on social media (Facebook or LinkedIn) or download it as an MP4, you choose the language during the sharing process. In this case, you still work with a single source video and can select a different language each time you share it.

  • Sharing a Translated Video with the SundaySky Player
  • Sharing a Translated Video on Social Media or as an MP4


Sharing a Translated Video with the SundaySky Player
1. Clicking the language selector displays all available languages. When a viewer changes the language, the narration updates accordingly.
  Player_language_selector.png
2. Clicking CC (closed captions) displays all available languages, along with an option to turn captions off. In practice, viewers can listen to the narration in one language while displaying captions in a different language.
  Closed_captions.png

Sharing a Translated Video on Social Media or as an MP4

With this method, the language is selected before sharing the video. In the example below, the language is chosen before downloading the video, creating a copy in the requested language.

Download_language_selection.png


Editing Languages and Translations

As you work on a video, you can make edits at any time. This section outlines the types of edits you can perform. Once the video is updated, it must be approved again before it can be shared. As with the initial approval, re-approving a video also consumes AI usage credits.

  • Adding a Language
  • Removing a Language
  • Editing a Translation
  • Adding a Scene to a Video with Translations


Adding a Language
1. Open the relevant video in the Studio.
2. Click Languages in the sidebar.
3. In the Languages panel, click the pencil icon.
  Edit_to_add.png
4. Add the new language.
  New_language_added.png
5. Click Apply changes.
6. Review and edit the translation for the newly added language.

Removing a Language
1. Open the relevant video in the Studio.
2. Click Languages in the sidebar.
3. In the Languages panel, click the pencil icon.
  Edit_to_remove.png
4. Click the X next to the language you want to remove.
  Remove_language.png
5. Click Apply changes.
  The language you removed is no longer included in the video.

Editing a Translation
1. Open the relevant video in the Studio.
2. In the relevant scene, double-click the narration placeholder.
3. From the drop-down list, select the translation you want to edit.
  Select_translation.png
4. Edit the translation.
▶ Note that changes you make to this translation will not affect any other translations or the narration in the source language. This is a manual edit specific to this language only.
  Edit_the_translation.png
  Note the message displayed when you re-approve the video. Here's what it means:
In our example, one word was added to a scene in one language. Because the scene's duration is approximately 13 seconds, 13 AI usage credits will be consumed. Credits are calculated based on the full duration of the scene, not just the portion that was added or removed.
  Re-approval message.png
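
As a rough illustration of this per-scene rule (using the same assumed one-credit-per-second model as in the sketch above, not an official formula):

    # One word changed in a single translated scene is still charged for the scene's full duration.
    edited_scene_duration_sec = 13   # approximate duration of the edited scene
    translations_affected = 1        # the manual edit applies to one language only
    print(edited_scene_duration_sec * translations_affected)  # -> 13 credits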

Adding a Scene to a Video with Translations
1. Open the relevant video in the Studio.
2. In the scene line-up, select the scene after which you want to add a new one.
3. Click the + sign at the end of the scene line-up to add a new scene.
4. From the Scene Library, select the desired scene.
5. Double-click the narration placeholder in the newly added scene.
  In the Narration dialog window, note that the drop-down list for translations cannot be opened because the narration in the source language has not yet been added to the scene.
  Translations_unavailable.png
6. Add the narration in the source language.
  The drop-down list can now be opened to view the narration translations.
  Drop_down_enabled.png
7. Review the translations for the new scene and make any necessary edits.
  Additional_translations.png

Handling Personalized Narration in Translations

Just like a single-language video, a video with translations supports both 1:1 personalization with personalization tokens and Message by Audience. The sections below explain how these personalization methods are handled when translations are enabled.

  • 1:1 Personalization with a Personalization Token
  • Message by Audience


1:1 Personalization with a Personalization Token

In the source-language narration, the personalization token is inserted in the appropriate position for that language. In the example below, the Location personalization token displays the medical complex location based on the viewer's data.

Personalization_token.png

The platform handles the personalization token automatically, using AI to insert it at the appropriate position in the translated text. Even so, it's still important to review the translated text to ensure the token is placed correctly.
In translations, you can only use personalization tokens that were already added to the source language; other personalization tokens from the Data Library cannot be used.

Italian_personalization_token.png


Message by Audience

When using Message by Audience, you create variations of a narration based on the values of a data field.
In the Narration dialog window, these narrations behave just like regular narrations: the narration written in the source language determines what is narrated in the translations, and you can then edit the translated narration in the standard manner.

When a video that uses Message by Audience is approved, the duration of every narration variation in a scene counts toward the AI usage credits consumed for that scene.

Example:
Assume a scene is translated into Italian and French.

  • The Italian translation includes three variations (5 seconds each) and one fallback narration (5 seconds).
  • The French translation also includes three variations (5 seconds each) and one fallback narration (5 seconds).

Each translation therefore totals 20 seconds of narration.
Combined, both languages for a single scene consume 40 AI usage credits.
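
The same back-of-the-envelope estimate can be sketched in code. Every variation plus the fallback narration is counted for each translated language; the estimate_scene_credits function and the one-credit-per-second model are illustrative only, and the exact charge is shown at approval.

    # Illustrative estimate for a Message by Audience scene; the approval dialog is authoritative.
    def estimate_scene_credits(variation_durations_sec, fallback_duration_sec, num_translated_languages):
        # Each variation and the fallback are narrated in every translated language.
        per_language_seconds = sum(variation_durations_sec) + fallback_duration_sec
        return per_language_seconds * num_translated_languages

    # Example from above: three 5-second variations plus a 5-second fallback,
    # translated into Italian and French.
    print(estimate_scene_credits([5, 5, 5], 5, num_translated_languages=2))  # -> 40 credits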
