diff --git a/sources/platform/actors/development/actor_definition/docker.md b/sources/platform/actors/development/actor_definition/docker.md
index 9b604a88ac..a22718dd32 100644
--- a/sources/platform/actors/development/actor_definition/docker.md
+++ b/sources/platform/actors/development/actor_definition/docker.md
@@ -126,7 +126,7 @@ When the Playwright/Puppeteer version in your `package.json` differs from what's
:::
-### Using `*` as version (alternative approach)
+### Use `*` as version (alternative approach)
You may encounter older documentation or templates using `*` as the Playwright/Puppeteer version:
@@ -208,7 +208,7 @@ You can check out various optimization tips for Dockerfile in our [Performance](
:::
-## Updating older Dockerfiles
+## Update older Dockerfiles
All Apify base Docker images now use a non-root user to enhance security. This change requires updates to existing Actor `Dockerfile`s that use the `apify/actor-node`, `apify/actor-python`, `apify/actor-python-playwright`, or `apify/actor-python-selenium` images. This section provides guidance on resolving common issues that may arise during this migration.
@@ -293,7 +293,7 @@ You should remove these lines, as the new user is now `myuser`. Don't forget to
COPY --chown=myuser:myuser . ./
```
-#### Installing dependencies that require root access
+#### Install dependencies that require root access
The `root` user is still available in the Docker images. If you must run steps that require root access (like installing system packages with `apt` or `apk`), you can temporarily switch to the `root` user.
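A hedged sketch of such a temporary switch (the package name is illustrative; use `apk add` instead of `apt-get` on Alpine-based images, and substitute the non-root user name your base image actually uses):

```dockerfile
# Temporarily switch to root to install system packages...
USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends ffmpeg \
    && rm -rf /var/lib/apt/lists/*
# ...then switch back so the Actor doesn't run as root.
USER myuser
```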
diff --git a/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md b/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md
index 32e863746a..c7450d2536 100644
--- a/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md
+++ b/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md
@@ -218,7 +218,7 @@ If the calculation results in an error, the Actor will start with a fixed defaul
-### Testing expressions
+### Test expressions
#### Use npm package
diff --git a/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md b/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md
index cea9169224..9fd8a2becc 100644
--- a/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md
+++ b/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md
@@ -102,4 +102,4 @@ It's possible to define custom error messages in sub-properties as well. For obj
## Best practices
-Custom error messages can be useful in specific cases, but they aren't always necessary. In most situations, the default validation messages are clear enough and ensure consistency across the platform. Use custom messages only when they meaningfully improve clarity—for example, when the default message would expose an unreadable regular expression or fail to explain a non-obvious requirement.
+Custom error messages can be useful in specific cases, but they aren't always necessary. In most situations, the default validation messages are clear enough and ensure consistency across the platform. Use custom messages only when they meaningfully improve clarity - for example, when the default message would expose an unreadable regular expression or fail to explain a non-obvious requirement.
diff --git a/sources/platform/actors/development/actor_definition/input_schema/specification.md b/sources/platform/actors/development/actor_definition/input_schema/specification.md
index 9d8d43153e..af4a1dca87 100644
--- a/sources/platform/actors/development/actor_definition/input_schema/specification.md
+++ b/sources/platform/actors/development/actor_definition/input_schema/specification.md
@@ -22,7 +22,7 @@ The Actor input schema file is used to:
To define an input schema for an Actor, set `input` field in the `.actor/actor.json` file to an input schema object (described below), or path to a JSON file containing the input schema object.
-For backwards compatibility, if the `input` field is omitted, the system looks for an `INPUT_SCHEMA.json` file either in the `.actor` directory or the Actor's top-level directory—but note that this functionality is deprecated and might be removed in the future. The maximum allowed size for the input schema file is 500 kB.
+For backwards compatibility, if the `input` field is omitted, the system looks for an `INPUT_SCHEMA.json` file either in the `.actor` directory or the Actor's top-level directory - but note that this functionality is deprecated and might be removed in the future. The maximum allowed size for the input schema file is 500 kB.
When you provide an input schema, the Apify platform will validate the input data passed to the Actor on start (via the API or Apify Console) to ensure compliance before starting the Actor.
If the input object doesn't conform the schema, the caller receives an error and the Actor is not started.
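As an illustrative sketch (the field values are placeholders), an inline input schema in `.actor/actor.json` can look like this:

```json
{
    "actorSpecification": 1,
    "name": "my-actor",
    "version": "1.0",
    "input": {
        "title": "My Actor input",
        "type": "object",
        "schemaVersion": 1,
        "properties": {
            "startUrl": {
                "title": "Start URL",
                "type": "string",
                "editor": "textfield",
                "description": "The URL the Actor opens first."
            }
        },
        "required": ["startUrl"]
    }
}
```

Alternatively, `input` can be a path such as `"./input_schema.json"` pointing to a file containing the same object.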
@@ -372,7 +372,7 @@ Properties:
| Property | Value | Required | Description |
|------------|-----------------------------------------------------|----------|-------------------------------------------------------------------------------|
-| `type` | One of | Yes | Defines the type of the field — either an integer or a floating-point number. |
+| `type` | One of | Yes | Defines the type of the field - either an integer or a floating-point number. |
| `editor` | One of: | No | Visual editor used for input field. |
| `maximum` | Integer or Number (based on the `type`) | No | Maximum allowed value. |
| `minimum` | Integer or Number (based on the `type`) | No | Minimum allowed value. |
diff --git a/sources/platform/actors/development/actor_definition/output_schema/index.md b/sources/platform/actors/development/actor_definition/output_schema/index.md
index d4709248db..95a6545926 100644
--- a/sources/platform/actors/development/actor_definition/output_schema/index.md
+++ b/sources/platform/actors/development/actor_definition/output_schema/index.md
@@ -280,7 +280,7 @@ When a user runs the Actor in the Console, the UI will look like this:
-### Using container URL to display chat client
+### Use container URL to display chat client
In this example, an Actor runs a web server that provides a chat interface to an LLM.
The conversation history is then stored in the dataset.
diff --git a/sources/platform/actors/development/builds_and_runs/index.md b/sources/platform/actors/development/builds_and_runs/index.md
index 5c3c0c71fb..590f2170a8 100644
--- a/sources/platform/actors/development/builds_and_runs/index.md
+++ b/sources/platform/actors/development/builds_and_runs/index.md
@@ -11,7 +11,7 @@ slug: /actors/development/builds-and-runs
Actor **builds** and **runs** are fundamental concepts within the Apify platform. Understanding them is crucial for effective use of the platform.
-## Building an Actor
+## Build an Actor
When you start the build process for your Actor, you create a _build_. A build is a Docker image containing your source code and the required dependencies needed to run the Actor:
@@ -27,7 +27,7 @@ flowchart LR
AD -- "build process" --> Build
```
-## Running an Actor
+## Run an Actor
To create a _run_, you take your _build_ and start it with some input:
diff --git a/sources/platform/actors/development/builds_and_runs/state_persistence.md b/sources/platform/actors/development/builds_and_runs/state_persistence.md
index 218b88f387..2c2c48972c 100644
--- a/sources/platform/actors/development/builds_and_runs/state_persistence.md
+++ b/sources/platform/actors/development/builds_and_runs/state_persistence.md
@@ -21,7 +21,7 @@ To prevent data loss, long-running Actors should:
For short-running Actors, the risk of restarts and the cost of repeated runs are low, so you can typically ignore state persistence.
-## Understanding migrations
+## Understand migrations
A migration occurs when a process running on one server must stop and move to another. During this process:
@@ -45,7 +45,7 @@ Migrations don't follow a specific schedule. They can occur at any time due to t
By default, an Actor keeps its state in the server's memory. During a server switch, the run loses access to the previous server's memory. Even if data were saved on the server's disk, access to that would also be lost. Note that the Actor run's default dataset, key-value store and request queue are preserved across migrations, by state we mean the contents of runtime variables in the Actor's code.
-## Implementing state persistence
+## Implement state persistence
The [Apify SDKs](/sdk) handle state persistence automatically.
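The underlying pattern can be sketched in plain JavaScript. This is not the SDK's actual API - the `Map` stands in for the run's key-value store, and all names are illustrative:

```javascript
// Sketch of the persist/resume pattern the Apify SDKs implement for you.
const store = new Map(); // stand-in for the run's key-value store

const setValue = (key, value) => { store.set(key, JSON.stringify(value)); };
const getValue = (key) => (store.has(key) ? JSON.parse(store.get(key)) : null);

// Process items, checkpointing progress after each one so a migrated
// run can resume where the previous server left off.
function processBatch(items, stopAfter = Infinity) {
    const state = getValue('STATE') ?? { nextIndex: 0 };
    while (state.nextIndex < items.length && state.nextIndex < stopAfter) {
        // ... do the real work for items[state.nextIndex] here ...
        state.nextIndex += 1;
        setValue('STATE', state); // checkpoint survives a server switch
    }
    return state;
}

const items = ['a', 'b', 'c', 'd'];
processBatch(items, 2);              // run is interrupted after two items
const resumed = processBatch(items); // restarted run resumes at index 2
console.log(resumed.nextIndex);      // 4 - every item processed exactly once
```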
diff --git a/sources/platform/actors/development/deployment/continuous_integration.md b/sources/platform/actors/development/deployment/continuous_integration.md
index 9f7d0e3bc7..93737196c3 100644
--- a/sources/platform/actors/development/deployment/continuous_integration.md
+++ b/sources/platform/actors/development/deployment/continuous_integration.md
@@ -32,7 +32,7 @@ Set up continuous integration for your Actors using one of these methods:
Choose the method that best fits your workflow.
-## Option 1: Trigger builds with a Webhook
+## Option 1: Trigger builds with a webhook
1. Push your Actor to a GitHub repository.
1. Go to your Actor's detail page in Apify Console, click on the API tab in the top right, then select API Endpoints. Copy the **Build Actor** API endpoint URL. The format is as follows:
diff --git a/sources/platform/actors/development/permissions/index.md b/sources/platform/actors/development/permissions/index.md
index 181990ce0b..2329a3ed74 100644
--- a/sources/platform/actors/development/permissions/index.md
+++ b/sources/platform/actors/development/permissions/index.md
@@ -45,7 +45,7 @@ To learn how to migrate your Actors to run under limited permissions, check out
:::
-### Configuring Actor permissions level
+### Configure Actor permissions level
You can set the permission level for your Actor in the Apify Console under its **Settings** tab. New Actors are configured to use limited permissions by default. Older Actors might still use full permissions until you update their configuration.
@@ -66,7 +66,7 @@ When possible, design your Actors to use limited permissions and request only th
:::
-### Accessing user provided storages
+### Access user-provided storages
By default, limited-permissions Actors can't access user storages. However, they can access storages that users explicitly provide via the Actor input. To do so, use the input schema to add a storage picker and declare exactly which operations your Actor needs.
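As a rough sketch of such a storage picker - note that the exact property names (`resourceType`, `resourcePermissions`) are assumptions here and should be verified against the input schema specification - the input schema field might look like:

```json
"inputDataset": {
    "title": "Input dataset",
    "type": "string",
    "description": "A dataset the user provides for this run.",
    "resourceType": "dataset",
    "resourcePermissions": ["READ"]
}
```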
diff --git a/sources/platform/actors/development/quick-start/build_with_ai.md b/sources/platform/actors/development/quick-start/build_with_ai.md
index 5b55c5f1d0..ff365e6557 100644
--- a/sources/platform/actors/development/quick-start/build_with_ai.md
+++ b/sources/platform/actors/development/quick-start/build_with_ai.md
@@ -32,7 +32,7 @@ The prompt guides AI coding assistants such as Antigravity, Cursor, Claude Code
1. Create directory: `mkdir my-new-actor`
1. Open the directory in _Antigravity_, _Cursor_, _Claude Code_, _VS Code with GitHub Copilot_, etc.
1. Copy the prompt above and paste it into your AI coding assistant (Agent or Chat)
-1. Run it, and develop your first actor with the help of AI
+1. Run it, and develop your first Actor with the help of AI
:::info Avoid copy-pasting
@@ -52,9 +52,9 @@ If you do not have Apify CLI installed, see the [installation guide](/cli/docs/i
The command above will guide you through Apify Actor initialization, where you select an Actor Template that works for you. The result is an initialized Actor (with AGENTS.md) ready for development.
-## Use Apify MCP Server
+## Use Apify MCP server
-The Apify MCP Server has tools to search and fetch documentation. If you set it up in your AI editor, it will help you improve the generated code by providing additional context to the AI.
+The Apify MCP server has tools to search and fetch documentation. If you set it up in your AI editor, it will help you improve the generated code by providing additional context to the AI.
:::info Use Apify MCP server configuration
diff --git a/sources/platform/actors/development/quick-start/start_locally.md b/sources/platform/actors/development/quick-start/start_locally.md
index a107d14db2..52a0f7ffd9 100644
--- a/sources/platform/actors/development/quick-start/start_locally.md
+++ b/sources/platform/actors/development/quick-start/start_locally.md
@@ -143,7 +143,7 @@ Let's now deploy your Actor to the Apify platform, where you can run the Actor o
apify push
```
-### Step 5: It's Time to Iterate!
+### Step 5: It's time to iterate!
Good job! 🎉 You're ready to develop your Actor. You can make changes to your Actor and implement your use case.
diff --git a/sources/platform/actors/index.mdx b/sources/platform/actors/index.mdx
index bfd6843002..078f25eed6 100644
--- a/sources/platform/actors/index.mdx
+++ b/sources/platform/actors/index.mdx
@@ -6,7 +6,7 @@ category: platform
slug: /actors
---
-**Learn how to run, develop, and publish Apify Actors — serverless cloud programs for web data extraction and workflow automation.**
+**Learn how to run, develop, and publish Apify Actors - serverless cloud programs for web data extraction and workflow automation.**
import Card from "@site/src/components/Card";
import CardGrid from "@site/src/components/CardGrid";
@@ -61,7 +61,7 @@ Build Actors to automate tasks, scrape data, or create custom workflows. The Api
Ready to start? Check out the [Actor development documentation](/platform/actors/development).
-## Running Actors
+## Run Actors
You can run Actors manually in [Apify Console](https://console.apify.com/actors), using the [API](/api), [CLI](/cli), or [scheduler](../schedules.md). You can easily [integrate Actors](../integrations/index.mdx) with other apps, [share](../collaboration/access_rights.md) them with other people, [publish](./publishing/index.mdx) them in [Apify Store](https://apify.com/store), and even [monetize](./publishing/monetize/index.mdx).
diff --git a/sources/platform/actors/publishing/index.mdx b/sources/platform/actors/publishing/index.mdx
index b781ddec57..5afccca102 100644
--- a/sources/platform/actors/publishing/index.mdx
+++ b/sources/platform/actors/publishing/index.mdx
@@ -57,7 +57,7 @@ To ensure long-term quality and improve your chances of successfully monetizing
If you decide to make your Actor's code publicly available on [GitHub](https://github.com), code quality becomes even more crucial, as your Actor may be the first experience some users have with Apify.
-### Handling breaking changes
+### Handle breaking changes
While refactoring and updating your Actor's code is encouraged, be cautious of making changes that could break the Actor for existing users. If you plan to introduce breaking change, please contact us at [community@apify.com](mailto:community@apify.com) beforehand, and we'll assist you in communicating the change to your users.
diff --git a/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx b/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx
index d58140917c..91f42fe9b7 100644
--- a/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx
+++ b/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx
@@ -64,7 +64,7 @@ While optional, we recommend offering progressively lower prices for higher disc
Your platform costs are also lower for these higher tier, which helps maintain healthy profit margins. This is further detailed in the [Computing your costs for PPE and PPR Actors](#computing-your-costs-for-ppe-and-ppr-actors) section.
-## Implementing discount tiers
+## Implement discount tiers
By default, we advise against setting excessively high prices for _FREE_ tier users, as this can limit the ability to evaluate your Actor thoroughly. However, in certain situations, such as protecting your Actor from fraudulent activity or excessive use of your internal APIs, a higher price for _FREE_ tier users might be justified.
diff --git a/sources/platform/actors/running/index.md b/sources/platform/actors/running/index.md
index 88be596a39..d5b103f52c 100644
--- a/sources/platform/actors/running/index.md
+++ b/sources/platform/actors/running/index.md
@@ -54,7 +54,7 @@ And you can use the export button at the bottom left to export the data in multi
And that's it! Now you can get back to the Actor's input, play with it, and try out more of the [Apify Actors](https://apify.com/store) or [build your own](./development).
-## Running via Apify API
+## Run via Apify API
Actors can also be invoked using the Apify API by sending an HTTP POST request to the [Run Actor](/api/v2/#/reference/actors/run-collection/run-actor) endpoint, such as:
@@ -66,7 +66,7 @@ An Actor's input and its content type can be passed as a payload of the POST req
> To learn more about this, read the [Run an Actor or task and retrieve data via API](/academy/api/run-actor-and-retrieve-data-via-api) tutorial.
-## Running programmatically
+## Run programmatically
Actors can also be invoked programmatically from your own applications or from other Actors.
diff --git a/sources/platform/actors/running/input_and_output.md b/sources/platform/actors/running/input_and_output.md
index 05f5a441ac..07186674de 100644
--- a/sources/platform/actors/running/input_and_output.md
+++ b/sources/platform/actors/running/input_and_output.md
@@ -41,7 +41,7 @@ As part of the input, you can also specify run options such as [Build](../develo
:::info Dynamic memory
-If the Actor is configured by developer to use [dynamic memory](../development/actor_definition/dynamic_actor_memory/index.md), the system will calculate the optimal memory allocation based on your input. In this case, the **Memory** option acts as an override — if you set it, the calculated value will be ignored.
+If the Actor is configured by the developer to use [dynamic memory](../development/actor_definition/dynamic_actor_memory/index.md), the system will calculate the optimal memory allocation based on your input. In this case, the **Memory** option acts as an override - if you set it, the calculated value will be ignored.
:::
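For illustration, the calculated memory can be overridden when starting a run via the API (the `memory` query parameter is in megabytes; `ACTOR_ID`, the token, and the input body are placeholders):

```shell
curl -X POST "https://api.apify.com/v2/acts/ACTOR_ID/runs?memory=4096" \
    -H "Authorization: Bearer $APIFY_TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"startUrl": "https://example.com"}'
```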
diff --git a/sources/platform/actors/running/runs_and_builds.md b/sources/platform/actors/running/runs_and_builds.md
index 9accc07dae..52d81a0fdf 100644
--- a/sources/platform/actors/running/runs_and_builds.md
+++ b/sources/platform/actors/running/runs_and_builds.md
@@ -88,8 +88,8 @@ flowchart LR
| FAILED | terminal | Run failed |
| TIMING-OUT | transitional | Timing out now |
| TIMED-OUT | terminal | Timed out |
-| ABORTING | transitional | Being aborted by the user |
-| ABORTED | terminal | Aborted by the user |
+| ABORTING | transitional | Being aborted by the user |
+| ABORTED | terminal | Aborted by the user |
### Aborting runs
@@ -127,6 +127,6 @@ Apify securely stores your ten most recent runs indefinitely, ensuring your reco
**Actor builds** are deleted only when they are _not tagged_ and have not been used for over 90 days.
-## Sharing
+## Share
Share your Actor runs with other Apify users via the [access rights](../../collaboration/index.md) system.
diff --git a/sources/platform/actors/running/store.md b/sources/platform/actors/running/store.md
index a79b426235..542e2ed8f5 100644
--- a/sources/platform/actors/running/store.md
+++ b/sources/platform/actors/running/store.md
@@ -25,7 +25,7 @@ All Actors in [Apify Store](https://apify.com/store) fall into one of the four p
1. [**Rental**](#rental-actors) - to continue using the Actor after the trial period, you must rent the Actor from the developer and pay a flat monthly fee in addition to the costs associated with the platform usage that the Actor generates.
2. [**Pay per result**](#pay-per-result) - you do not pay for platform usage the Actor generates and instead just pay for the results it produces.
-3. [**Pay per event**](#pay-per-event) - you pay for specific events the Actor creator defines, such as generating a single result or starting the Actor. Most Actors include platform usage in the price, but some may charge it separately — check the Actor's pricing for details.
+3. [**Pay per event**](#pay-per-event) - you pay for specific events the Actor creator defines, such as generating a single result or starting the Actor. Most Actors include platform usage in the price, but some may charge it separately - check the Actor's pricing for details.
4. [**Pay per usage**](#pay-per-usage) - you can run the Actor and you pay for the platform usage the Actor generates.
### Rental Actors
@@ -39,7 +39,7 @@ Most rental Actors have a _free trial_ period. The length of the trial is displa
After a trial period, a flat monthly _Actor rental_ fee is automatically subtracted from your prepaid platform usage in advance for the following month. Most of this fee goes directly to the developer and is paid on top of the platform usage generated by the Actor. You can read more about our motivation for releasing rental Actors in [this blog post](https://blog.apify.com/make-regular-passive-income-developing-web-automation-actors-b0392278d085/) from Apify's CEO Jan Čurn.
-#### Rental Actors - Frequently Asked Questions
+#### Rental Actors - Frequently asked questions
##### Can I run rental Actors via API or the Apify client?
@@ -92,7 +92,7 @@ This makes it transparent and easy to estimate upfront costs. If you have any fe
-#### Pay per result Actors - Frequently Asked Questions
+#### Pay per result Actors - Frequently asked questions
##### How do I know an Actor is paid per result?
@@ -106,7 +106,7 @@ No, the Actor is free to run. You only pay for the results.
##### What happens when I interact with the dataset after the run finishes?
-Under the **pay per result** model, all platform costs generated _during the run of an Actor_ are not charged towards your account; you pay for the results instead. After the run finishes, any interactions with the default dataset storing the results, such as reading the results or writing additional data, will incur the standard platform usage costs. But do not worry, in the vast majority of cases, you only want to read the result from the dataset and that costs near to nothing.
+Under the **pay per result** model, platform costs generated _during the run of an Actor_ are not charged towards your account; you pay for the results instead. After the run finishes, any interactions with the default dataset storing the results, such as reading the results or writing additional data, will incur the standard platform usage costs. But do not worry - in the vast majority of cases, you only want to read the results from the dataset, and that costs next to nothing.
##### Do I pay for the storage of results on the Apify platform?
@@ -149,9 +149,9 @@ Most pay per event Actors include platform usage in the event price. However, so
:::
-#### Pay per event Actors - Frequently Asked Questions
+#### Pay per event Actors - Frequently asked questions
-#### How do I know Actor is paid per events?
+#### How do I know an Actor is paid per event?
You will see that the Actor is paid per events next to the Actor name.
@@ -173,7 +173,7 @@ You would still pay for the long term storage of results, same as for pay per re
#### Do I need to pay for platform usage with pay per event Actors?
-In most cases, no — the majority of pay per event Actors include [platform usage](./usage_and_resources.md) in the event price, so you only pay for the events. However, some Actors may charge platform usage separately, in addition to the event costs. Always check the pricing section on the Actor's page—it clearly states whether platform usage is included or not.
+In most cases, no - the majority of pay per event Actors include [platform usage](./usage_and_resources.md) in the event price, so you only pay for the events. However, some Actors may charge platform usage separately, in addition to the event costs. Always check the pricing section on the Actor's page - it clearly states whether platform usage is included or not.
@@ -211,7 +211,7 @@ _For more information on platform usage cost see the [usage and resources](./usa
:::
-## Reporting issues with Actors
+## Report issues with Actors
Each Actor has an **Issues** tab in Apify Console. There, you can open an issue (ticket) and chat with the Actor's author, platform admins,
and other users of this Actor. Please feel free to use the tab to ask any questions, request new features, or give feedback. Alternatively, you can
diff --git a/sources/platform/collaboration/general-resource-access.md b/sources/platform/collaboration/general-resource-access.md
index 5e3d695b74..a5d4e38874 100644
--- a/sources/platform/collaboration/general-resource-access.md
+++ b/sources/platform/collaboration/general-resource-access.md
@@ -24,7 +24,7 @@ This setting affects the following resources:
- Key-value stores
- Request queues
-Access to resources that require explicit access — such as Actors, tasks or schedules are not affected by this setting.
+Resources that require explicit access - such as Actors, tasks, or schedules - are not affected by this setting.
@@ -60,9 +60,9 @@ Even if your access is set to **Restricted** there are a few built-in exceptions
#### Builds of public Actors
-Builds of public Actors are always accessible to anyone who can view the Actor — regardless of the Actor owner’s account **General resource access** setting.
+Builds of public Actors are always accessible to anyone who can view the Actor - regardless of the Actor owner's account **General resource access** setting.
-This ensures that public Actors in Apify Store continue to work as expected. For example, if you open a public Actor in Console, you’ll also be able to view its build details, download logs, or inspect the source package — without needing extra permissions or a token.
+This ensures that public Actors in Apify Store continue to work as expected. For example, if you open a public Actor in Console, you'll also be able to view its build details, download logs, or inspect the source package - without needing extra permissions or a token.
This exception exists to maintain usability and avoid breaking workflows that rely on public Actors. It only applies to builds of Actors that are marked as **public**. For private Actors, build access still follows the general resource access setting of the owner’s account.
@@ -73,7 +73,7 @@ When you share an Actor with a collaborator, you can choose to share read-only a
- This access includes logs, input, and default storages (dataset, key-value store, request queue)
- Access is one-way: you won’t see the collaborator’s runs unless they share them
- Collaborators can’t see each other’s runs
-- This works even if your account uses **restricted general resource access** — permissions are applied automatically.
+- This works even if your account uses **restricted general resource access** - permissions are applied automatically.
#### Automatically sharing runs with public Actor creators
@@ -83,13 +83,13 @@ If you’re using a public Actor from Apify Store, you can choose to automatical
- When enabled, your runs of public Actors are automatically visible to the Actor’s creator
- Shared runs include logs, input, and output storages (dataset, key-value store, request queue)
-This sharing works even if your account has **General resource access** set to **Restricted** — the platform applies specific permission checks to ensure the Actor creator can access only the relevant runs.
+This sharing works even if your account has **General resource access** set to **Restricted** - the platform applies specific permission checks to ensure the Actor creator can access only the relevant runs.
You can disable this behavior at any time by turning off the setting in your account.
#### Automatically sharing runs via Actor Issues
-When you report an issue on an Actor and include a **run URL**, that run is automatically shared with the Actor developer — **even if your account uses restricted general resource access**.
+When you report an issue on an Actor and include a **run URL**, that run is automatically shared with the Actor developer - **even if your account uses restricted general resource access**.
This automatic sharing ensures the developer can view all the context they need to troubleshoot the issue effectively. That includes:
@@ -101,13 +101,13 @@ This automatic sharing ensures the developer can view all the context they need
The access is granted through explicit, behind-the-scenes permissions (not anonymous or public access), and is limited to just that run and its related storages. No other resources in your account are affected.
-This means you don’t need to manually adjust permissions or share multiple links when reporting an Actor issue — **just including the run URL in your issue is enough**
+This means you don't need to manually adjust permissions or share multiple links when reporting an Actor issue - **just including the run URL in your issue is enough**.
## Per-resource access control
-The account level access control can be changed on individual resources. This can be done by setting the general access level to other than Restricted in the share dialog for a given resource. This way the resource level setting takes precedence over the account setting.
+The account-level access control can be changed on individual resources. You can do this by setting the general access level to a value other than **Restricted** in the share dialog for a given resource. The resource-level setting then takes precedence over the account setting.
@@ -125,7 +125,7 @@ await datasetClient.update({
### Sharing restricted resources with pre-signed URLs {#pre-signed-urls}
-Even when a resource is restricted, you might still want to share it with someone outside your team — for example, to send a PDF report to a client, or include a screenshot in an automated email or Slack message. In these cases, _storage resources_ (like key-value stores, datasets, and request queues) support generating _pre-signed URLs_. These are secure, time-limited links that let others access individual files without needing an Apify account or authentication.
+Even when a resource is restricted, you might still want to share it with someone outside your team - for example, to send a PDF report to a client, or include a screenshot in an automated email or Slack message. In these cases, _storage resources_ (like key-value stores, datasets, and request queues) support generating _pre-signed URLs_. These are secure, time-limited links that let others access individual files without needing an Apify account or authentication.
#### How pre-signed URLs work
@@ -237,7 +237,7 @@ If the `expiresInSecs` option is not specified, the generated link will be _perm
#### Signing URLs manually
-If you need finer control — for example, generating links without using Apify client — you can sign URLs manually using our reference implementation.
+If you need finer control - for example, generating links without using the Apify client - you can sign URLs manually using our reference implementation.
[Check the reference implementation in Apify clients](https://github.com/apify/apify-client-js/blob/5efd68a3bc78c0173a62775f79425fad78f0e6d1/src/resource_clients/dataset.ts#L179)
@@ -301,7 +301,7 @@ const recordUrl = `https://api.apify.com/v2/key-value-stores/${storeId}/records/
const storeClient = client.keyValueStore(storeId);
const recordUrl = await storeClient.getRecordPublicUrl(recordKey);
-// Save pre-signed URL — accessible without authentication
+// Save pre-signed URL - accessible without authentication
await Actor.pushData({ recordUrl });
```
@@ -324,7 +324,7 @@ You can easily test this by switching your own account’s setting to _Restricte
:::tip Make sure links work as expected
-Once you’ve enabled restricted access, run your Actor and confirm that all links generated in logs, datasets, key-value stores, and status messages remain accessible as expected. Make sure any shared URLs — especially those stored in results or notifications — work without requiring an API token.
+Once you've enabled restricted access, run your Actor and confirm that all links generated in logs, datasets, key-value stores, and status messages remain accessible as expected. Make sure any shared URLs - especially those stored in results or notifications - work without requiring an API token.
:::
diff --git a/sources/platform/collaboration/organization_account/how_to_use.md b/sources/platform/collaboration/organization_account/how_to_use.md
index ee54ae198b..3f78448e4a 100644
--- a/sources/platform/collaboration/organization_account/how_to_use.md
+++ b/sources/platform/collaboration/organization_account/how_to_use.md
@@ -22,7 +22,7 @@ You can switch into **Organization account** view using the account button in th

-In the menu, the account you are currently using is displayed at the top, with all the accounts you can switch to displayed below. When you need to get back to your personal account, you can just switch right back to it—no need to log in and out.
+In the menu, the account you are currently using is displayed at the top, with all the accounts you can switch to displayed below. When you need to get back to your personal account, you can just switch right back to it - no need to log in and out.
The resources you can access and account details you can edit will depend on your [permissions](../list_of_permissions.md) in the organization.
diff --git a/sources/platform/console/index.md b/sources/platform/console/index.md
index 78293381f6..b740c203fd 100644
--- a/sources/platform/console/index.md
+++ b/sources/platform/console/index.md
@@ -24,7 +24,7 @@ This is the most common way of creating an account. You just need to provide you
After you click the **Sign up** button, we will send you a verification email. The email contains a link that you need to click on or copy to your browser to proceed to automated email verification. After we verify your email, you will proceed to Apify Console.
:::info CAPTCHA
-We are using Google reCaptcha to prevent spam accounts. Usually, you will not see it, but if Google evaluates your browser as suspicious, they will ask you to solve a reCaptcha before we create your account and send you the verification email.
+We are using Google reCAPTCHA to prevent spam accounts. Usually, you will not see it, but if Google evaluates your browser as suspicious, they will ask you to solve a reCAPTCHA before we create your account and send you the verification email.
:::
If you did not receive the email, you can visit the [sign-in page](https://console.apify.com/sign-in). There, you will either proceed to our verification page right away, or you can sign in and will be redirected afterward. On the verification page, you can click on the **Resend verification email** button to send the email again.
@@ -56,13 +56,13 @@ In case you forgot your password, you can click on the **Forgot your password?**

-## Adding different authentication methods
+## Add different authentication methods
After you create your account, you might still want to use the other authentication methods. To do that, go to the [Login & Privacy](https://console.apify.com/settings/security) section of your account settings. There, you will see all available authentication methods and their configuration.

-## Resetting your password
+## Reset your password
This section also allows you to reset your password if you ever forget it. To do that, click the **Send email to reset password** button.
We will then send an email to the address connected to your account with a link to the password reset page.
@@ -94,34 +94,34 @@ You can also navigate Apify Console via keyboard shortcuts.
Keyboard Shortcuts
-|Shortcut| Tab |
-|:---|:----|
-|Show shortcuts | Shift? |
-|Home| GH |
-|Store| GO |
-|Actors| GA |
-|Development| GD |
-|Saved tasks| GT |
-|Runs| GR |
-|Integrations | GI |
-|Schedules| GU |
-|Storage| GE |
-|Proxy| GP |
-|Settings| GS |
-|Billing| GB |
+| Shortcut | Tab |
+| :--- | :--- |
+| Show shortcuts | Shift? |
+| Home | GH |
+| Store | GO |
+| Actors | GA |
+| Development | GD |
+| Saved tasks | GT |
+| Runs | GR |
+| Integrations | GI |
+| Schedules | GU |
+| Storage | GE |
+| Proxy | GP |
+| Settings | GS |
+| Billing | GB |
| Tab name | Description |
-|:---|:---|
-| [Apify Store](/platform/console/store)| Search for Actors that suit your web-scraping needs. |
-| [Actors](/platform/actors)| View recent & bookmarked Actors. |
-| [Runs](/platform/actors/running/runs-and-builds)| View your recent runs. |
-| [Saved tasks](/platform/actors/running/tasks)| View your saved tasks. |
-| [Schedules](/platform/schedules)| Schedule Actor runs & tasks to run at specified time. |
-| [Integrations](/platform/integrations)| View your integrations. |
-| [Development](/platform/actors/development)| • My Actors - See Actors developed by you. • Insights - see analytics for your Actors. • Messaging - check on issues reported in your Actors or send emails to users of your Actors. |
-| [Proxy](/platform/proxy)| View your proxy usage & credentials |
-| [Storage](/platform/storage)| View stored results of your runs in various data formats. |
-| [Billing](/platform/console/billing)| Billing information, statistics and invoices. |
-| [Settings](/platform/console/settings)| Settings of your account. |
+| :--- | :--- |
+| [Apify Store](/platform/console/store) | Search for Actors that suit your web-scraping needs. |
+| [Actors](/platform/actors) | View recent & bookmarked Actors. |
+| [Runs](/platform/actors/running/runs-and-builds) | View your recent runs. |
+| [Saved tasks](/platform/actors/running/tasks) | View your saved tasks. |
+| [Schedules](/platform/schedules) | Schedule Actor runs & tasks to run at a specified time. |
+| [Integrations](/platform/integrations) | View your integrations. |
+| [Development](/platform/actors/development) | • My Actors - See Actors developed by you. • Insights - See analytics for your Actors. • Messaging - Check on issues reported in your Actors or send emails to users of your Actors. |
+| [Proxy](/platform/proxy) | View your proxy usage & credentials. |
+| [Storage](/platform/storage) | View stored results of your runs in various data formats. |
+| [Billing](/platform/console/billing) | Billing information, statistics and invoices. |
+| [Settings](/platform/console/settings) | Settings of your account. |
diff --git a/sources/platform/console/two-factor-authentication.md b/sources/platform/console/two-factor-authentication.md
index 0061735f77..75f8f88e55 100644
--- a/sources/platform/console/two-factor-authentication.md
+++ b/sources/platform/console/two-factor-authentication.md
@@ -14,7 +14,7 @@ If you use your email and password to sign in to Apify Console, you can enable t
Some organizations might require two-factor authentication (2FA) to access their resources. Members of such an organization must enable 2FA on their account to continue accessing shared resources and maintain compliance with their security policies.
-## Setting up two-factor authentication
+## Set up two-factor authentication
To set up two-factor authentication, go to the [Login & Privacy](https://console.apify.com/settings/security) section of your account settings. There, look for the **Two-factor authentication** section. Currently, there is only one option, which is the **Authenticator app**. If you have two-factor authentication already enabled, there will be a label **enabled** next to it.
@@ -36,7 +36,7 @@ A new pop-up window will appear where you can copy the two-factor `secret` key,
After you scan the QR code or set up your app manually, the app will generate a code that you need to enter into the **Verify the code from the app** field. After you enter the code, click on the **Continue** button to get to the next step of the setup process.
-### Recovery settings
+### Set up recovery settings

@@ -63,7 +63,7 @@ After you enable two-factor authentication, the next time you attempt to sign in

-## Using recovery codes
+## Use recovery codes
In case you lose access to your authenticator app, you can use the recovery codes to sign in to your account. To do that, click on the **recovery code or begin 2FA account recovery** link below the **Verify** button. This will redirect you to a view similar to the current one, but instead of code from the authenticator app, you will need to enter one of the 16 recovery codes you received during the setup process.
@@ -76,7 +76,7 @@ When you successfully use a recovery code, we remove the code from the original

-## Disabling two-factor authentication
+## Disable two-factor authentication
If you no longer want to use two-factor authentication, or you lose access to your authenticator app, you can disable two-factor authentication in the [Login & Privacy](https://console.apify.com/settings/security) section of your account settings. Go to the **Two-factor authentication** section and click on the **Disable** button. We will ask you to enter either your verification code from the authenticator app or, if you no longer have access to it, one of your recovery codes. After entering the code, click on the **Remove app** button to verify the provided code. If it's valid, this will disable two-factor authentication and remove the configuration from your account.
@@ -90,7 +90,7 @@ If you lose access to your authenticator app and do not have any recovery codes
For our support team to help you recover your account, you will need to provide them with the personal information you have configured during the two-factor authentication setup. If you provide the correct information, the support team will help you regain access to your account.
-:::caution
+:::caution Support verification
The support team will not give you any clues about the information you provided; they will only verify if it is correct.
:::
diff --git a/sources/platform/index.mdx b/sources/platform/index.mdx
index 63eab3a5f8..8a64ceda9e 100644
--- a/sources/platform/index.mdx
+++ b/sources/platform/index.mdx
@@ -13,7 +13,7 @@ import homepageContent from "./homepage_content.json";
**Apify** is a cloud platform and marketplace for web data extraction and automation tools called **Actors**.
-## Getting started
+## Get started
Learn how to run any Actor in Apify Store or create your own. A step-by-step guide takes you through your first steps on the Apify platform.
diff --git a/sources/platform/integrations/actors/index.md b/sources/platform/integrations/actors/index.md
index 18435148fd..a639ccc7aa 100644
--- a/sources/platform/integrations/actors/index.md
+++ b/sources/platform/integrations/actors/index.md
@@ -37,7 +37,7 @@ This leads you to a setup screen, where you can provide:
- **Input for the integrated Actor**: Typically, the input has two parts: information that is independent of the triggering run, and information that is specific to that run. The run-independent information (e.g., a database connection string or table name) can be added to the input as is. The run-specific information (e.g., a dataset ID) is either obtained from the implicit `payload` field (this is the case for most Actors that are integration-ready), or it can be provided using variables.
- **Available variables** are the same ones as in webhooks. The one that you probably are going to need the most is `{{resource}}`, which is the Run object in the same shape you get from the [API](/api/v2/actor-run-get) (for build event types, it will be the Build object). The variables can make use of dot notation, so you will most likely just need `{{resource.defaultDatasetId}}` or `{{resource.defaultKeyValueStoreId}}`.
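For illustration, a target Actor's input on the setup screen might combine both kinds of information like this (the `tableName` field is hypothetical; the `{{resource.defaultDatasetId}}` variable is resolved by the platform when the trigger fires):

```json
{
  "tableName": "my_results_table",
  "datasetId": "{{resource.defaultDatasetId}}"
}
```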
-## Testing your integration
+## Test your integration
When adding a new integration, you can test it using a past run or build as a trigger. This will trigger a run of your target Actor or task as if your desired trigger event just occurred. The only difference between a test run and a regular run is that the trigger's event type will be set to 'TEST'. The test run will still consume compute units.
diff --git a/sources/platform/integrations/actors/integration_ready_actors.md b/sources/platform/integrations/actors/integration_ready_actors.md
index b4f0e66fa2..4db22d6adb 100644
--- a/sources/platform/integrations/actors/integration_ready_actors.md
+++ b/sources/platform/integrations/actors/integration_ready_actors.md
@@ -88,7 +88,7 @@ const datasetIdToProcess = datasetId || payload?.resource?.defaultDatasetId;
In the above example, we're focusing on accessing a run's default dataset, but the approach would be similar for any other field.
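As a sketch of the same pattern applied to another field, a fallback for the run's default key-value store might look like this (the `keyValueStoreId` input field and the sample payload are hypothetical):

```javascript
// Sample input as an Actor might receive it from a webhook trigger.
const input = {
  keyValueStoreId: null, // explicit input field, not provided here
  payload: { resource: { defaultKeyValueStoreId: 'store123' } },
};

const { keyValueStoreId, payload } = input;
// Prefer the explicit field; fall back to the triggering run's default store.
const storeIdToProcess = keyValueStoreId || payload?.resource?.defaultKeyValueStoreId;
```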
-## Making your Actor available to other users
+## Make your Actor available to other users
To allow other users to use your Actor as an integration, all you need to do is [publish it in Apify Store](/platform/actors/publishing), so users can then integrate it using the **Connect Actor or task** button on the **Integrations** tab of any Actor. While publishing the Actor is enough, there are two ways to make it more visible to users.
diff --git a/sources/platform/integrations/ai/agno.md b/sources/platform/integrations/ai/agno.md
index b0bc22d5d1..802c5dd882 100644
--- a/sources/platform/integrations/ai/agno.md
+++ b/sources/platform/integrations/ai/agno.md
@@ -26,7 +26,7 @@ This guide shows how to integrate Apify Actors with Agno to empower your AI agen
### Prerequisites
-- _Apify API token_: Obtain your API token from the [Apify console](https://console.apify.com/account/integrations).
+- _Apify API token_: Obtain your API token from the [Apify Console](https://console.apify.com/account/integrations).
- _OpenAI API key_: Get your API key from the [OpenAI platform](https://platform.openai.com/account/api-keys).
:::tip Alternative LLM providers
diff --git a/sources/platform/integrations/ai/aws_bedrock.md b/sources/platform/integrations/ai/aws_bedrock.md
index 04d3cf9ccb..6ff61c2936 100644
--- a/sources/platform/integrations/ai/aws_bedrock.md
+++ b/sources/platform/integrations/ai/aws_bedrock.md
@@ -42,7 +42,7 @@ The following image illustrates the key components of an AWS Bedrock AI agent:

-### Building an Agent
+### Build an Agent
To begin, open the Amazon Bedrock console and select agents from the left navigation panel.
On the next screen, click Create agent to start building your agent.
diff --git a/sources/platform/integrations/ai/chatgpt.md b/sources/platform/integrations/ai/chatgpt.md
index 9ab1176c99..6311a92866 100644
--- a/sources/platform/integrations/ai/chatgpt.md
+++ b/sources/platform/integrations/ai/chatgpt.md
@@ -64,8 +64,8 @@ Once your connector is ready:
> “Search the web and summarize recent trends in AI agents”
-You’ll need to grant permission for each Apify tool when it’s used for the first time.
-You should see ChatGPT calling Apify tools — such as the [RAG Web Browser](https://apify.com/apify/rag-web-browser) — to gather information.
+You'll need to grant permission for each Apify tool when it's used for the first time.
+You should see ChatGPT calling Apify tools - such as the [RAG Web Browser](https://apify.com/apify/rag-web-browser) - to gather information.

diff --git a/sources/platform/integrations/ai/crewai.md b/sources/platform/integrations/ai/crewai.md
index 1e7f6336a4..c248bd3983 100644
--- a/sources/platform/integrations/ai/crewai.md
+++ b/sources/platform/integrations/ai/crewai.md
@@ -12,7 +12,7 @@ slug: /integrations/crewai
## What is CrewAI
-[CrewAI](https://www.crewai.com/) is an open-source Python framework designed to orchestrate autonomous, role-playing AI agents that collaborate as a "crew" to tackle complex tasks. It enables developers to define agents with specific roles, assign tasks, and integrate tools—like Apify Actors—for real-world data retrieval and automation.
+[CrewAI](https://www.crewai.com/) is an open-source Python framework designed to orchestrate autonomous, role-playing AI agents that collaborate as a "crew" to tackle complex tasks. It enables developers to define agents with specific roles, assign tasks, and integrate tools - like Apify Actors - for real-world data retrieval and automation.
:::note Explore CrewAI
@@ -34,7 +34,7 @@ This guide demonstrates how to integrate Apify Actors with CrewAI by building a
pip install 'crewai[tools]' langchain-apify langchain-openai
```
-### Building the TikTok profile search and analysis crew
+### Build the TikTok profile search and analysis crew
First, import all required packages:
diff --git a/sources/platform/integrations/ai/flowise.md b/sources/platform/integrations/ai/flowise.md
index fbffcfa60f..97f04edbb4 100644
--- a/sources/platform/integrations/ai/flowise.md
+++ b/sources/platform/integrations/ai/flowise.md
@@ -36,7 +36,7 @@ It will be available on `https://localhost:3000`
Other methods of using Flowise can be found in their [documentation](https://docs.flowiseai.com/getting-started#quick-start)
-### Building your flow
+### Build your flow
After running Flowise, you can start building your flow with Apify.
diff --git a/sources/platform/integrations/ai/google-adk.md b/sources/platform/integrations/ai/google-adk.md
index 168c2e04e4..46e47f59a2 100644
--- a/sources/platform/integrations/ai/google-adk.md
+++ b/sources/platform/integrations/ai/google-adk.md
@@ -97,6 +97,6 @@ Find a pub near the Ferry Building in San Francisco.
- [Apify Actors](https://docs.apify.com/platform/actors)
- [Google ADK documentation](https://google.github.io/adk-docs/get-started/)
- [What are AI agents?](https://blog.apify.com/what-are-ai-agents/)
-- [Apify MCP Server](https://mcp.apify.com)
-- [Apify MCP Server documentation](https://docs.apify.com/platform/integrations/mcp)
+- [Apify MCP server](https://mcp.apify.com)
+- [Apify MCP server documentation](https://docs.apify.com/platform/integrations/mcp)
- [Apify OpenRouter proxy](https://apify.com/apify/openrouter)
diff --git a/sources/platform/integrations/ai/langflow.md b/sources/platform/integrations/ai/langflow.md
index d0f0852269..c195dd2dcf 100644
--- a/sources/platform/integrations/ai/langflow.md
+++ b/sources/platform/integrations/ai/langflow.md
@@ -57,14 +57,14 @@ When the platform is started, open the Langflow UI using `http://127.0.0.1:7860`
> Other installation methods can be found in the [Langflow documentation](https://docs.langflow.org/get-started-installation).
-### Creating a new flow
+### Create a new flow
On the Langflow welcome screen, click the **New Flow** button and then create **Blank Flow**:

Now, you can start building your flow.
-### Calling Apify Actors in Langflow
+### Call Apify Actors in Langflow
To call Apify Actors in Langflow, you need to add the **Apify Actors** component to the flow.
From the bundle menu, add **Apify Actors** component:
@@ -98,7 +98,7 @@ When you run the component again, the output contains only the `markdown` and fl
Now that you understand how to call Apify Actors, let's build a practical example where you search for a company's social media profiles and extract data from them.
-### Building a flow to search for a company's social media profiles
+### Build a flow to search for a company's social media profiles
Create a new flow and add two **Apify Actors** components from the menu.
diff --git a/sources/platform/integrations/ai/langgraph.md b/sources/platform/integrations/ai/langgraph.md
index 27d76d9e41..cd7be22418 100644
--- a/sources/platform/integrations/ai/langgraph.md
+++ b/sources/platform/integrations/ai/langgraph.md
@@ -36,7 +36,7 @@ This guide will demonstrate how to use Apify Actors with LangGraph by building a
pip install langgraph langchain-apify langchain-openai
```
-### Building the TikTok profile search and analysis agent
+### Build the TikTok profile search and analysis agent
First, import all required packages:
diff --git a/sources/platform/integrations/ai/lindy.md b/sources/platform/integrations/ai/lindy.md
index 6275507d56..42155e6623 100644
--- a/sources/platform/integrations/ai/lindy.md
+++ b/sources/platform/integrations/ai/lindy.md
@@ -58,7 +58,7 @@ You have access to thousands of Actors available on the [Apify Store](https://ap
This establishes the fundamental workflow:
_Chatting with Lindy can now trigger the Apify Instagram Profile Scraper._
-### Extending Your Workflow
+### Extend your workflow
Lindy offers different triggers (e.g., _email received_, _Slack message received_, etc.) and actions beyond running an Actor.
diff --git a/sources/platform/integrations/ai/mastra.md b/sources/platform/integrations/ai/mastra.md
index 7c666d73d2..ad71ba2929 100644
--- a/sources/platform/integrations/ai/mastra.md
+++ b/sources/platform/integrations/ai/mastra.md
@@ -6,7 +6,7 @@ sidebar_position: 11
slug: /integrations/mastra
---
-**Learn how to build AI agents with Mastra and Apify Actors MCP Server.**
+**Learn how to build AI agents with Mastra and Apify Actors MCP server.**
---
@@ -22,7 +22,7 @@ Check out the [Mastra docs](https://mastra.ai/docs) for more information.
## What is MCP server
-A [Model Context Protocol](https://modelcontextprotocol.io) (MCP) server exposes specific data sources or tools to agents via a standardized protocol. It acts as a bridge, connecting large language models (LLMs) to external systems like databases, APIs, or local filesystems. Built on a client-server architecture, MCP servers enable secure, real-time interaction, allowing agents to fetch context or execute actions without custom integrations. Think of it as a modular plugin system for agents, simplifying how they access and process data. Apify provides [Actors MCP Server](https://mcp.apify.com/) to expose [Apify Actors](https://docs.apify.com/platform/actors) from the [Apify Store](https://apify.com/store) as tools via the MCP protocol.
+A [Model Context Protocol](https://modelcontextprotocol.io) (MCP) server exposes specific data sources or tools to agents via a standardized protocol. It acts as a bridge, connecting large language models (LLMs) to external systems like databases, APIs, or local filesystems. Built on a client-server architecture, MCP servers enable secure, real-time interaction, allowing agents to fetch context or execute actions without custom integrations. Think of it as a modular plugin system for agents, simplifying how they access and process data. Apify provides [Actors MCP server](https://mcp.apify.com/) to expose [Apify Actors](https://docs.apify.com/platform/actors) from the [Apify Store](https://apify.com/store) as tools via the MCP protocol.
## How to use Apify with Mastra via MCP
@@ -39,7 +39,7 @@ This guide demonstrates how to integrate Apify Actors with Mastra by building an
npm install @mastra/core @mastra/mcp @ai-sdk/openai
```
-### Building the TikTok profile search and analysis agent
+### Build the TikTok profile search and analysis agent
First, import all required packages:
@@ -124,7 +124,7 @@ await mcpClient.disconnect();
:::note Use any Apify Actor
-Since it uses the [Apify MCP Server](https://mcp.apify.com), swap in any Apify Actor from the [Apify Store](https://apify.com/store) by updating the startup request’s `actors` parameter.
+Since it uses the [Apify MCP server](https://mcp.apify.com), swap in any Apify Actor from the [Apify Store](https://apify.com/store) by updating the startup request’s `actors` parameter.
No other changes are needed in the agent code.
:::
@@ -216,7 +216,7 @@ await mcpClient.disconnect();
- [Apify Actors](https://docs.apify.com/platform/actors)
- [Mastra Documentation](https://mastra.ai/docs)
-- [Apify MCP Server](https://mcp.apify.com)
+- [Apify MCP server](https://mcp.apify.com)
- [How to use MCP with Apify Actors](https://blog.apify.com/how-to-use-mcp/)
- [Apify Store](https://apify.com/store)
- [What are AI Agents?](https://blog.apify.com/what-are-ai-agents/)
diff --git a/sources/platform/integrations/ai/mcp.md b/sources/platform/integrations/ai/mcp.md
index 56719f774b..cd93057081 100644
--- a/sources/platform/integrations/ai/mcp.md
+++ b/sources/platform/integrations/ai/mcp.md
@@ -17,7 +17,7 @@ using [Model Context Protocol](https://modelcontextprotocol.io/docs/getting-star
discover and run Actors from [Apify Store](https://apify.com/store), access storages and results,
and enable AI coding assistants to access Apify documentation and tutorials.
-
+
## Prerequisites
@@ -201,7 +201,7 @@ VS Code supports MCP through GitHub Copilot's agent mode (requires Copilot subsc
:::tip One-click installation
-Download and run the [Apify MCP Server `.mcpb` file](https://github.com/apify/actors-mcp-server/releases/latest/download/apify-mcp-server.mcpb) for one-click installation.
+Download and run the [Apify MCP server `.mcpb` file](https://github.com/apify/actors-mcp-server/releases/latest/download/apify-mcp-server.mcpb) for one-click installation.
:::
@@ -400,7 +400,7 @@ documentation queries. If you exceed this limit, you'll receive a `429` response
## Support and resources
-The Apify MCP Server is an open-source project. Report bugs, suggest features, or ask questions in the [GitHub repository](https://github.com/apify/apify-mcp-server/issues).
+The Apify MCP server is an open-source project. Report bugs, suggest features, or ask questions in the [GitHub repository](https://github.com/apify/apify-mcp-server/issues).
If you find this project useful, please star it on [GitHub](https://github.com/apify/apify-mcp-server) to show your support!
diff --git a/sources/platform/integrations/ai/openai_agents.md b/sources/platform/integrations/ai/openai_agents.md
index 1ec6ce8f26..277280da57 100644
--- a/sources/platform/integrations/ai/openai_agents.md
+++ b/sources/platform/integrations/ai/openai_agents.md
@@ -24,7 +24,7 @@ Before integrating Apify with OpenAI Agents SDK, you'll need:
pip install agents openai
```
-## Building a web search agent with Apify MCP
+## Build a web search agent with Apify MCP
You can connect to the Apify MCP server using streamable HTTP with Bearer token authentication. Use your Apify API token by setting the `Authorization: Bearer ` header in the MCP server configuration.
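As a sketch, an MCP client configuration with the Bearer header might look like this (key names vary by client; adjust to your client's config format):

```json
{
  "mcpServers": {
    "apify": {
      "url": "https://mcp.apify.com",
      "headers": {
        "Authorization": "Bearer YOUR_APIFY_TOKEN"
      }
    }
  }
}
```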
@@ -84,7 +84,7 @@ The agent may take some time (seconds or even minutes) to execute tool calls, es
:::
-### Using specific Actors
+### Use specific Actors
You can configure the Apify MCP server to expose specific Actors by including them in the URL query parameters. For example, to use an Instagram scraper:
diff --git a/sources/platform/integrations/ai/skyfire.md b/sources/platform/integrations/ai/skyfire.md
index 20e1f4a38c..cfc1be30ac 100644
--- a/sources/platform/integrations/ai/skyfire.md
+++ b/sources/platform/integrations/ai/skyfire.md
@@ -25,7 +25,7 @@ Keep in mind that agentic payments are an experimental feature and may undergo s
With Skyfire integration, agents can discover available Apify Actors, execute scraping and automation tasks, and pay for services using pre-funded Skyfire tokens, all without human intervention.
-## Using Skyfire with Apify MCP Server
+## Use Skyfire with Apify MCP server
The [Apify MCP server](https://docs.apify.com/platform/integrations/mcp) provides the simplest way for agents to access Apify's Actor library using Skyfire payments.
@@ -136,7 +136,7 @@ See which Actors [support agentic payments](#supported-actors).
When not pre-loading Actors, agents can discover suitable Actors dynamically using the search tools. The search automatically filters results to show only Actors that support agentic payments.
-## Using Skyfire with Apify API
+## Use Skyfire with Apify API
For direct API integration, you can use Skyfire PAY tokens to authenticate and pay for Actor runs.
@@ -154,7 +154,7 @@ Instead of using a traditional Apify API token, pass your Skyfire PAY token in t
skyfire-pay-id: YOUR_SKYFIRE_PAY_TOKEN
```
-### Running an Actor
+### Run an Actor
Make a standard Actor run request to the [run Actor endpoint](https://docs.apify.com/api/v2#/reference/actors/run-collection/run-actor), but include the Skyfire PAY token in the header.
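The request above can be sketched as a small builder (the Actor ID and input are placeholders; the request object is built but not sent):

```javascript
// Build a run-Actor API request paid with a Skyfire PAY token.
function buildRunRequest(actorId, payToken, input) {
  return {
    url: `https://api.apify.com/v2/acts/${actorId}/runs`,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'skyfire-pay-id': payToken, // Skyfire PAY token instead of an Apify API token
    },
    body: JSON.stringify(input),
  };
}

const req = buildRunRequest('apify~web-scraper', 'YOUR_SKYFIRE_PAY_TOKEN', { startUrls: [] });
```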
diff --git a/sources/platform/integrations/ai/vercel-ai-sdk.md b/sources/platform/integrations/ai/vercel-ai-sdk.md
index 37a773b518..c446843267 100644
--- a/sources/platform/integrations/ai/vercel-ai-sdk.md
+++ b/sources/platform/integrations/ai/vercel-ai-sdk.md
@@ -34,7 +34,7 @@ Apify is a marketplace of ready-to-use web scraping and automation tools, AI age
npm install @modelcontextprotocol/sdk @openrouter/ai-sdk-provider ai
```
-### Building a simple pub search AI agent using Apify Google Maps scraper
+### Build a simple pub search AI agent using Apify Google Maps scraper
First, import all required packages:
@@ -106,6 +106,6 @@ await mcpClient.close();
- [Apify Actors](https://docs.apify.com/platform/actors)
- [Vercel AI SDK documentation](https://ai-sdk.dev/docs/introduction)
- [What are AI agents?](https://blog.apify.com/what-are-ai-agents/)
-- [Apify MCP Server](https://mcp.apify.com)
-- [Apify MCP Server documentation](https://docs.apify.com/platform/integrations/mcp)
+- [Apify MCP server](https://mcp.apify.com)
+- [Apify MCP server documentation](https://docs.apify.com/platform/integrations/mcp)
- [Apify OpenRouter proxy](https://apify.com/apify/openrouter)
diff --git a/sources/platform/integrations/data-storage/airtable/index.md b/sources/platform/integrations/data-storage/airtable/index.md
index d2551837ad..d16b82b90d 100644
--- a/sources/platform/integrations/data-storage/airtable/index.md
+++ b/sources/platform/integrations/data-storage/airtable/index.md
@@ -64,7 +64,7 @@ The extension provides the following capabilities:
### Run Actor
-1. Select any Actor from **Apify store** or **recently used Actors**
+1. Select any Actor from **Apify Store** or **recently used Actors**

1. Fill in the Actor input form.
@@ -90,7 +90,7 @@ Retrieve items from any Apify dataset and import them into your Airtable base wi
This section explains how to map your Actor run results or dataset items into your Airtable base.
-#### Understanding mapping rows
+#### Understand mapping rows
The Apify extension provides UI elements that allow you to map dataset fields to Airtable fields.
@@ -132,7 +132,7 @@ _How it works_: For a source field like `crawl.depth`, the extension checks for
To prevent duplicate records, select a **Unique ID** on the data mapping step. The unique ID is added to the list of mapping rows. Ensure it points to the correct field in your table. During import, the extension filters data by existing values in the table.

-#### Preview Mapped Data
+#### Preview mapped data
Preview the results and start the import.
diff --git a/sources/platform/integrations/integrate_with_apify.md b/sources/platform/integrations/integrate_with_apify.md
index a99f6176e3..b4315e7ac5 100644
--- a/sources/platform/integrations/integrate_with_apify.md
+++ b/sources/platform/integrations/integrate_with_apify.md
@@ -34,7 +34,7 @@ Actor-specific integrations are designed for targeted use cases. While they work
For more examples both general and Actor-specific, check [integrations](./index.mdx).
-## Integrating with Apify
+## Integrate with Apify
To integrate your service with Apify, you have two options:
@@ -43,11 +43,11 @@ To integrate your service with Apify, you have two options:

-### Building an integration Actor
+### Build an integration Actor
One way to reach out to Apify users is directly within [Apify Console](https://console.apify.com). To do that, you need to build an integrable Actor that can be piped into other Actors to upload existing data into a database. This can then be easily configured within Apify Console. Follow the [guide on building integration-ready Actors](./actors/integration_ready_actors.md).
-### Building an external integration
+### Build an external integration
An alternative way is to let your users manage the connection directly on your side using [Apify API](https://docs.apify.com/api/v2) and our API clients for [JavaScript](/api/client/js/) or [Python](/api/client/python/). This way, users can manage the connection directly from your service.
@@ -155,7 +155,7 @@ Users create their own Apify accounts and are billed directly by Apify for their
Users access Apify through your platform without needing an Apify account. Apify bills you based on consumption, and you factor costs into your pricing.
-### Monitoring and tracking
+### Monitor and track
To help Apify monitor and support your integration, every API request should identify your platform. You can do this in one of two ways:
@@ -184,12 +184,12 @@ For inspiration, check out the public repositories of Apify's existing external
- Zapier
- [Zapier integration documentation](https://docs.apify.com/platform/integrations/zapier)
- - [Source code on Github](https://github.com/apify/apify-zapier-integration)
+ - [Source code on GitHub](https://github.com/apify/apify-zapier-integration)
- Make.com
- [Make.com integration documentation](https://docs.apify.com/platform/integrations/make)
- Kestra
- [Kestra integration documentation](https://kestra.io/plugins/plugin-apify)
- - [Source code on Github](https://github.com/kestra-io/plugin-apify)
+ - [Source code on GitHub](https://github.com/kestra-io/plugin-apify)
- Keboola
- [Keboola integration documentation](https://docs.apify.com/platform/integrations/keboola)
- [Source code on GitHub](https://github.com/apify/keboola-ex-apify/) (JavaScript)
diff --git a/sources/platform/integrations/programming/api.md b/sources/platform/integrations/programming/api.md
index 6923b2d64f..20bb1762f6 100644
--- a/sources/platform/integrations/programming/api.md
+++ b/sources/platform/integrations/programming/api.md
@@ -16,7 +16,7 @@ If you want to use the Apify API from JavaScript/Node.js or Python, we strongly
- [**apify-client**](/api/client/js/) `npm` package for JavaScript, supporting both browser and server
- [**apify-client**](/api/client/python/) PyPI package for Python.
-You are not required to those packages—the REST API works with any HTTP client—but the official API clients implement best practices such as exponential backoff and rate limiting.
+You are not required to use those packages - the REST API works with any HTTP client - but the official API clients implement best practices such as exponential backoff and rate limiting.
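As an illustration of the retry behavior those clients handle for you, here is a minimal exponential-backoff sketch. This is not the clients' actual code, and the retry counts and delays are arbitrary example values:

```python
import time

def call_with_backoff(fn, max_retries=5, base_delay=0.1):
    """Retry fn on any exception, doubling the delay between attempts.

    Illustrative sketch of the backoff strategy the official Apify
    clients apply to transient API errors; not the clients' actual code.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** attempt)

# A function that fails twice before succeeding:
attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("transient error")
    return "ok"

print(call_with_backoff(flaky))  # prints "ok" after two retried failures
```

Rolling your own retry loop like this is only needed with a plain HTTP client; the official packages do it out of the box.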
## API token
@@ -24,7 +24,7 @@ To access the Apify API in your integrations, you need to authenticate using you

-:::caution
+:::caution Protect your API token
Do not share the API token with untrusted parties, or use it directly from client-side code,
unless you fully understand the consequences! You can also consider [limiting the permission scope](#limited-permissions) of the token, so that it can only access what it really needs.
:::
@@ -69,7 +69,7 @@ By default, tokens can access all data in your account. If that is not desirable
**A scoped token can access only those resources that you'll explicitly allow it to.**
-:::info
+:::info Actor modification restrictions
We do not allow scoped tokens to create or modify Actors. If you do need to create or modify Actors through Apify API, use an unscoped token.
:::
@@ -89,19 +89,19 @@ We support two different types of permissions for tokens:
- **Resource-specific permissions**: These will apply only to specific, existing resources. For example, you can use these to allow the token to read from a particular dataset.
-:::tip
+:::tip Combine permission types
A single token can combine both types. You can create a token that can _read_ any data storage, but _write_ only to one specific key-value store.
:::

-### Allowing tokens to create resources
+### Allow tokens to create resources
If you need to create new resources with the token (for example, create a new task or storage), you need to explicitly allow that as well.
Once you create a new resource with the token, _the token will gain full access to that resource_, regardless of other permissions. It is not possible to create a token that can create a dataset, but not write to it.
-:::tip
+:::tip Dynamic resource creation
This is useful if, for example, you want to create a token that can dynamically create and populate datasets without needing access to the other datasets in your account.
:::
@@ -124,7 +124,7 @@ Specifically:
- To create or update a Schedule, the token needs access not only to the Schedule itself, but also to the Actor (the **Run** permission) or task (the **Read** permission) that is being scheduled.
- Similarly, to create, update or run a task, the token needs the **Run** permission on the task's Actor itself.
-:::tip
+:::tip Schedule creation example
Let's say that you have an Actor and you want to programmatically create schedules for that Actor. Then you can create a token that has the account level **Create** permission on schedules, but only the resource-specific **Run** permission on the Actor. Such a token has exactly the permissions it needs, and nothing more.
:::
@@ -147,7 +147,7 @@ When you run an Actor with a scoped token in this mode, Apify will inject an _un
This way you can be sure that once you give a token the permission to run an Actor, it will just work, and you don't have to worry about the exact permissions the Actor might need. However, this also means that you need to trust the Actor.
-:::tip
+:::tip Third-party integration
Use this mode if you want to integrate with a 3rd-party service to run your Actors. Create a scoped token that can only run the Actor you need, and share it with the service. Even if the token is leaked, it can't be used to access your other data.
:::
@@ -155,12 +155,13 @@ Use this mode if you want to integrate with a 3rd-party service to run your Acto
When you run an Actor with a scoped token in this mode, Apify will inject a token with the same scope as the scope of the original token.
-This way you can be sure that Actors won't accidentally—or intentionally—access any data they shouldn't. However, Actors might not function properly if the scope is not sufficient.
+This way you can be sure that Actors won't accidentally - or intentionally - access any data they shouldn't. However, Actors might not function properly if the scope is not sufficient.
-:::caution
+:::caution Standby mode limitation
Restricted access mode is not supported for Actors running in [Standby mode](/platform/actors/running/standby). While you can send standby requests using a scoped token configured with restricted access, functionality is not guaranteed.
+:::
-:::tip
+:::tip Transitive restrictions
This restriction is _transitive_, which means that if the Actor runs another Actor, its access will be restricted as well.
:::
@@ -180,7 +181,7 @@ If the toggle is **off**, the token can still trigger and inspect runs, but acce
- For accounts with **Unrestricted general resource access**, the default storages can still be read anonymously using their IDs, but writing is prevented.
-:::tip
+:::tip Clean up run data
Let's say your Actor produces a lot of data that you want to delete just after the Actor finishes. If you enable this toggle, your scoped token will be allowed to do that.
:::
@@ -198,11 +199,11 @@ If you set up a webhook pointing to the Apify API, the Apify platform will autom
Therefore, you need to make sure the token has sufficient permissions not only to set up the webhook, but also to perform the actual operation.
-:::tip
+:::tip Webhook permissions
Let's say you want to create a webhook that pushes an item to a dataset every time an Actor successfully finishes. Then such a scoped token needs to be allowed to both run the Actor (to create the webhook), and write to that dataset.
:::
-### Troubleshooting
+### Troubleshoot scoped tokens
#### How do I allow a token to run a task?
diff --git a/sources/platform/integrations/workflows-and-notifications/bubble.md b/sources/platform/integrations/workflows-and-notifications/bubble.md
index 33292e9049..ed5856b613 100644
--- a/sources/platform/integrations/workflows-and-notifications/bubble.md
+++ b/sources/platform/integrations/workflows-and-notifications/bubble.md
@@ -63,7 +63,7 @@ When configuring Apify actions in a workflow (check out screenshot below), set t
- 
-## Using the integration
+## Use the integration
Once the plugin is configured, you can start building automated workflows.
@@ -166,8 +166,8 @@ There are two common approaches:
### Display data
- This example appends the text result of an Actor run; it's a basic bind to the element’s text.
-- Create / select the UI visual element — in this example, `Text`.
-- In the Appearance tab, click the input area, select Insert dynamic data, and, according to your case, find the source — in this example, it's the `key_value_storages's recordContentText` custom state, where I set the result of the API call
+- Create / select the UI visual element - in this example, `Text`.
+- In the Appearance tab, click the input area, select Insert dynamic data, and, according to your case, find the source - in this example, it's the `key_value_storages's recordContentText` custom state, where the result of the API call is stored.
- 
### Display list of data
@@ -175,13 +175,13 @@ There are two common approaches:
- This example lists the current user's datasets and displays them in a repeating group.
- Add a **Repeating group** to the page.
1. Add data to a variable: create a custom state (for example, on the page) that will hold the list of datasets, and set it to the plugin's **List User Datasets** data call.
- - 
+ - 
1. Set the type: in the repeating group's settings, set **Type of content** to match the dataset object your variable returns.
- - 
+ - 
1. Bind the variable: set the repeating group's **Data source** to the variable from Step 1.
- - 
+ - 
- Inside the repeating group cell, bind dataset fields (for example, `Current cell's item name`, `id`, `createdAt`).
-- 
+- 
## Long‑running scrapes and Bubble time limits (async pattern)
diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/index.md b/sources/platform/integrations/workflows-and-notifications/gumloop/index.md
index 923029920c..c7b21f60cd 100644
--- a/sources/platform/integrations/workflows-and-notifications/gumloop/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/gumloop/index.md
@@ -38,7 +38,7 @@ Each tool has a corresponding Gumloop credit cost. Each Gumloop subscription com
| Get videos for a specific hashtag | Get Hashtag Videos | 3 credits/video |
| Show 5 most recent reviews for a restaurant | Get Place Reviews | 3 credits/review |
-## General integration (Apify Task Runner)
+## General integration (Apify task runner)
Gumloop's Apify task runner lets you run your Apify tasks directly inside Gumloop workflows. Scrape data with Apify, then process it with AI, send results via email, update spreadsheets, or connect to any of Gumloop's 100+ integrations.
diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md
index 61e10bd92f..f473ba2916 100644
--- a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md
+++ b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md
@@ -22,10 +22,10 @@ You can pull the following types of data from TikTok using Gumloop’s TikTok no
| Get hashtag videos | Fetch videos from TikTok hashtags with captions, engagement metrics, play counts, and author information. | 3 credits per item |
| Get profile videos | Get videos from TikTok user profiles with video metadata, engagement stats, music info, and timestamps. | 3 credits per item |
| Get profile followers | Retrieve followers or following lists from TikTok profiles, including usernames, follower counts, and bios. | 3 credits per item |
-| Get video details | Get comprehensive data on a specific TikTok video using its URL—includes engagement and video-level metrics. | 5 credits per item |
+| Get video details | Get comprehensive data on a specific TikTok video using its URL - includes engagement and video-level metrics. | 5 credits per item |
| Search videos | Search TikTok for videos and users using queries. Returns video details and user profile info. | 3 credits per item |
-## Retrieve Tiktok Data in Gumloop
+## Retrieve TikTok data in Gumloop
1. _Add the Gumloop TikTok MCP node_
diff --git a/sources/platform/integrations/workflows-and-notifications/kestra.md b/sources/platform/integrations/workflows-and-notifications/kestra.md
index dec808cd5d..5816af65af 100644
--- a/sources/platform/integrations/workflows-and-notifications/kestra.md
+++ b/sources/platform/integrations/workflows-and-notifications/kestra.md
@@ -1,16 +1,16 @@
---
title: Kestra integration
-description: Connect Apify with Kestra to orchestrate workflows — run flows, extract structured data, and react to Actor or task events.
+description: Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events.
sidebar_label: Kestra
sidebar_position: 7
slug: /integrations/kestra
---
-**Connect Apify with Kestra to orchestrate workflows — run flows, extract structured data, and react to Actor or task events.**
+**Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events.**
---
-[Kestra](https://kestra.io/) is an open-source, event-driven orchestration platform. The [Apify plugin for Kestra](https://github.com/kestra-io/plugin-kestra) connects Apify Actors and storage to your workflows. Run scrapers, extract structured data — all defined declaratively in YAML and orchestrated directly from the UI.
+[Kestra](https://kestra.io/) is an open-source, event-driven orchestration platform. The [Apify plugin for Kestra](https://github.com/kestra-io/plugin-kestra) connects Apify Actors and storage to your workflows. Run scrapers, extract structured data - all defined declaratively in YAML and orchestrated directly from the UI.
This guide shows you how to set up the integration, configure authentication, and create a workflow that runs an Actor and processes its results.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md
index d1a08ebcf6..bb37bf09d9 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md
@@ -39,7 +39,7 @@ Once connected, you can build workflows to automate website extraction and integ
After connecting the app, you can use one of the two modules as native scrapers to extract website content.
-### Standard Settings Module
+### Standard Settings module
The Standard Settings module is a streamlined component of the Website Content Crawler that allows you to quickly extract content from websites using optimized default settings. This module is perfect for extracting content from blogs, documentation sites, knowledge bases, or any text-rich website to feed into AI models.
@@ -95,7 +95,7 @@ For each crawled web page, you'll receive:
}
```
-### Advanced Settings Module
+### Advanced Settings module
The Advanced Settings module provides complete control over the content extraction process, allowing you to fine-tune every aspect of the crawling and transformation pipeline. This module is ideal for complex websites, JavaScript-heavy applications, or when you need precise control over content extraction.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/index.md b/sources/platform/integrations/workflows-and-notifications/make/index.md
index 5943aca327..266b1ead11 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/index.md
@@ -58,14 +58,14 @@ If you anticipate that the Actor run will exceed the timeout, use the asynchrono
:::
-The primary difference between the two methods is that the synchronous run waits for the Actor or task to finish and retrieves its output using the "Get Dataset Items" module. By contrast, the asynchronous run watches for the run of an Actor or task (which could have been triggered from another scenario, manually from Apify console or elsewhere) and gets its output once it finishes.
+The primary difference between the two methods is that the synchronous run waits for the Actor or task to finish and retrieves its output using the "Get Dataset Items" module. By contrast, the asynchronous run watches for the run of an Actor or task (which could have been triggered from another scenario, manually from Apify Console or elsewhere) and gets its output once it finishes.
### Synchronous run using the action module
In this example, we will demonstrate how to run an Actor synchronously and export the output to Google Sheets.
The same principle applies to the module that runs a task.
-#### Step 1: Add the Apify "Run an Actor" Module
+#### Step 1: Add the Apify "Run an Actor" module
First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify).
Next, add the Apify module called "Run an Actor" to your scenario and configure it.
@@ -97,9 +97,9 @@ You’re all set! Once the scenario is started, it will run the Actor synchronou
### Asynchronous run using the trigger module
In this example, we will demonstrate how to run an Actor asynchronously and export its output to Google Sheets.
-Before starting, decide where you want to initiate the Actor run. You can do this manually via the Apify console, on a schedule, or from a separate Make.com scenario.
+Before starting, decide where you want to initiate the Actor run. You can do this manually via the Apify Console, on a schedule, or from a separate Make.com scenario.
-#### Step 1: Add the Apify "Watch Actor Runs" Module
+#### Step 1: Add the Apify "Watch Actor Runs" module
First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify).
Next, add the Apify module called "Watch Actor Runs" to your scenario. This module will set up a webhook to listen for the finished runs of the selected Actor.
@@ -125,7 +125,7 @@ In the "Spreadsheet ID" field, enter the ID of the target Google Sheets file, wh

That’s it! Once the Actor run is complete, its data will be exported to the Google Sheets file.
-You can initiate the Actor run via the Apify console, a scheduler, or from another Make.com scenario.
+You can initiate the Actor run via the Apify Console, a scheduler, or from another Make.com scenario.
## Available modules and triggers
diff --git a/sources/platform/integrations/workflows-and-notifications/make/maps.md b/sources/platform/integrations/workflows-and-notifications/make/maps.md
index bfcf0fa87f..9cd01ae50b 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/maps.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/maps.md
@@ -57,7 +57,7 @@ The Search with Categories module is a component of the Google Maps Leads Scrape
- _Exact Name Matching_: Find businesses with exact or partial name matches
- _Operational Status Filter_: Exclude temporarily or permanently closed businesses
-#### How It Works
+#### How it works
The module allows you to combine category filtering with location parameters to discover relevant business leads, data mine reviews, or extract relevant Google Maps information. You can use categories alone or with specific search terms to create precisely targeted lead lists.
@@ -125,7 +125,7 @@ Categories can be general (e.g., "restaurant") which includes all variations lik
}
```
-### Search with Search Terms Module
+### Search with Search Terms module
The Search Terms module is a component of the Google Maps Leads Scraper designed to discover and extract business leads by using specific search queries, similar to how you'd search on Google Maps directly.
@@ -140,13 +140,13 @@ The Search Terms module is a component of the Google Maps Leads Scraper designed
- _Exact Name Matching_: Find businesses with exact or partial name matches
- _Operational Status Filter_: Exclude temporarily or permanently closed businesses
-#### How It Works
+#### How it works
This module allows you to enter search terms that match what you would typically type into the Google Maps search bar. You can search for general business types (like "coffee shop"), specific services ("dog grooming"), or product offerings ("organic produce").
The search results can be further refined using optional category filters, which help ensure you're capturing precisely the type of businesses you're targeting. For maximum efficiency, you can combine broader search terms with strategic category filters to capture the most relevant leads without excluding valuable prospects.
-### Advanced and Custom Search Module - Google Maps Leads Scraper
+### Advanced and Custom Search module - Google Maps Leads Scraper
The Advanced and Custom Search module is the most powerful component of the Google Maps Leads Scraper, designed for sophisticated lead generation campaigns that require precise geographic targeting and advanced search capabilities. This module gives you complete control over your lead discovery process with multiple location definition methods and advanced filtering options.
@@ -159,18 +159,18 @@ The Advanced and Custom Search module is the most powerful component of the Goog
- _Category Filtering_: Further refine results with optional category filters
- _Comprehensive Lead Filtering_: Apply multiple quality filters simultaneously for precise lead targeting
-#### How It Works
+#### How it works
This module provides the most flexible options for defining where and how to search for business leads:
-### Geographic Targeting Options
+### Geographic targeting options
- _Simple Location Query_: Use natural language location inputs like "New York, USA"
- _Structured Location Components_: Build precise locations using country, state, city, or county parameters
- _Postal Code Targeting_: Target specific postal/ZIP code areas for direct mail campaigns
- _Custom Polygon Areas_: Define exact geographic boundaries using coordinate pairs for ultra-precise targeting
-### Search and Filter Capabilities
+### Search and filter capabilities
- _Keyword-Based Search_: Discover businesses using industry, service, or product terms
- _Category-Based Filtering_: Apply Google's category system to refine results
diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/index.md b/sources/platform/integrations/workflows-and-notifications/n8n/index.md
index eecb6a3641..11a170db86 100644
--- a/sources/platform/integrations/workflows-and-notifications/n8n/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/n8n/index.md
@@ -21,7 +21,7 @@ Before you begin, make sure you have:
- An [Apify account](https://console.apify.com/)
- An [n8n instance](https://docs.n8n.io/learning-path/) (self‑hosted or cloud)
-## Install the Apify Node (self-hosted)
+## Install the Apify node (self-hosted)
If you're running a self-hosted n8n instance, you can install the Apify community node directly from the editor. This process adds the node to your available tools, enabling Apify operations in workflows.
@@ -34,7 +34,7 @@ If you're running a self-hosted n8n instance, you can install the Apify communit

-## Install the Apify Node (n8n Cloud)
+## Install the Apify node (n8n Cloud)
For n8n Cloud users, installation is even simpler and doesn't require manual package entry. Just search and add the node from the canvas.
@@ -82,7 +82,7 @@ For simplicity on n8n Cloud, use the API key method if you prefer manual control
With authentication set up, you can now create workflows that incorporate the Apify node.
-## Create a Workflow with the Apify Node
+## Create a workflow with the Apify node
Start by building a basic workflow in n8n, then add the Apify node to handle tasks like running Actors or fetching data.
@@ -130,7 +130,7 @@ Actions allow you to perform operations like running an Actor within a workflow.
1. Save and execute the workflow

-## Use Apify Node as an AI tool
+## Use Apify node as an AI tool
You can run Apify operations, retrieve the results, and use AI to process, analyze, and summarize the data, or generate insights and recommendations.
diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md b/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md
index 4b79fc7fc9..99e4962e27 100644
--- a/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md
+++ b/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md
@@ -157,7 +157,7 @@ You can access any of thousands of our scrapers on Apify Store by using the [gen
You can select the _Crawler type_ by choosing the rendering engine (browser or HTTP client) and the _Content extraction algorithm_ from multiple HTML transformers. _Element selectors_ allow you to specify which elements to keep, remove, or click, while _URL patterns_ let you define inclusion and exclusion rules with glob syntax. You can also set _Crawling parameters_ like concurrency, depth, timeouts, and retries. For robust crawling, you can configure _Proxy configuration_ settings and select from various _Output options_ for content formats and storage.
-## Usage as an AI Agent Tool
+## Usage as an AI agent tool
You can set up Apify's Scraper for AI Crawling node as a tool for your AI Agents.
diff --git a/sources/platform/integrations/workflows-and-notifications/telegram.md b/sources/platform/integrations/workflows-and-notifications/telegram.md
index c69ee337bd..39d852d186 100644
--- a/sources/platform/integrations/workflows-and-notifications/telegram.md
+++ b/sources/platform/integrations/workflows-and-notifications/telegram.md
@@ -73,7 +73,7 @@ The connection is now created and the configuration form closed.
## Connect Telegram bot with Zapier
-### Step 1: Create & connect new bot on Telegram
+### Step 1: Create and connect new bot on Telegram
After setting up Apify as your trigger within Zapier, it's time to set up Telegram as the action that will occur based on the trigger.
diff --git a/sources/platform/integrations/workflows-and-notifications/windmill.md b/sources/platform/integrations/workflows-and-notifications/windmill.md
index 188c3c697d..bc71fe17a1 100644
--- a/sources/platform/integrations/workflows-and-notifications/windmill.md
+++ b/sources/platform/integrations/workflows-and-notifications/windmill.md
@@ -60,7 +60,7 @@ You can import Apify integration scripts into your flows from the Windmill Hub,
You can provide the token to scripts via a **Windmill Resource**. Create it either in the **Resources** tab or directly from a script.
-#### Option A — Create in the Resources tab
+#### Option A - Create in the Resources tab
1. Open **Resources** → **New Resource**.
1. Select `apify_api_key` resource type.
@@ -69,7 +69,7 @@ You can provide the token to scripts via a **Windmill Resource**. Create it eith

-#### Option B — Create/bind from a script
+#### Option B - Create/bind from a script
1. Open the script in Windmill UI.
1. Add a secret input parameter (e.g., `apify_token`).
@@ -78,7 +78,7 @@ You can provide the token to scripts via a **Windmill Resource**. Create it eith

-#### Option C — OAuth authentication
+#### Option C - OAuth authentication
:::note Cloud-only feature
@@ -190,7 +190,7 @@ Windmill provides webhook-based triggers that can automatically start workflows

-## Deleting the webhook
+## Delete the webhook
1. Fork the **Apify's Delete Webhook** script from the Windmill Hub.
1. Set either your _API Key_ or _OAuth Token_ resource
diff --git a/sources/platform/integrations/workflows-and-notifications/workato.md b/sources/platform/integrations/workflows-and-notifications/workato.md
index 76d52fe334..200ffe7acf 100644
--- a/sources/platform/integrations/workflows-and-notifications/workato.md
+++ b/sources/platform/integrations/workflows-and-notifications/workato.md
@@ -150,13 +150,13 @@ _Triggers when an Apify Actor run finishes (succeeds, fails, times out, or gets
This trigger monitors a specific Apify Actor and starts the recipe when any run of that Actor reaches a terminal status. You can:
-- Select the Actor from recently used Actors or Apify store Actors
+- Select the Actor from recently used Actors or Apify Store Actors
- Choose to trigger on specific statuses (`ACTOR.RUN.SUCCEEDED`, `ACTOR.RUN.FAILED`, `ACTOR.RUN.TIMED_OUT`, `ACTOR.RUN.ABORTED`)
- Access run details, status, and metadata in subsequent recipe steps

-### Task Run Finished
+### Task run finished
_Triggers when an Apify Task run finishes (succeeds, fails, times out, or gets aborted)._
@@ -178,7 +178,7 @@ _Run an Apify Actor with customizable execution parameters._
This action runs an Apify Actor with your specified input and execution parameters. You can choose to wait for completion or start the run asynchronously. Actors are reusable serverless programs that can scrape websites, process data, and automate workflows. You can:
-- Select from your recently used Actors or Apify store Actors
+- Select from your recently used Actors or Apify Store Actors
- Provide input using dynamic schema-based fields or raw JSON
- Configure run options like memory allocation, timeout, and build version
- Choose between synchronous (wait for completion) or asynchronous execution
@@ -205,7 +205,7 @@ This action runs an Apify Task with optional input overrides and execution param

-### Get Dataset Items
+### Get dataset items
_Retrieves items from a dataset with dynamic field mapping._
@@ -215,7 +215,7 @@ Select a dataset to dynamically generate output fields and retrieve its items. T
- Retrieves data records from specified datasets with pagination support
- Returns structured data ready for downstream recipe steps
-#### Dynamic Schema Detection
+#### Dynamic schema detection
The connector samples your dataset to create appropriate output fields:
@@ -231,7 +231,7 @@ For optimal results, use datasets where all items follow a consistent structure.

-### Get Key-value store Record
+### Get key-value store record
_Retrieves a single record from a key-value store._
diff --git a/sources/platform/integrations/workflows-and-notifications/zapier.md b/sources/platform/integrations/workflows-and-notifications/zapier.md
index b72c572746..c20c564b26 100644
--- a/sources/platform/integrations/workflows-and-notifications/zapier.md
+++ b/sources/platform/integrations/workflows-and-notifications/zapier.md
@@ -94,11 +94,11 @@ Once you are happy with the test, you can publish the Zap. When it is turned on,
## Triggers
-### Finished Actor Run
+### Finished Actor run
> Triggers when a selected Actor run is finished.
-### Finished Task Run
+### Finished task run
> Triggers when a selected Actor task run is finished.
@@ -123,15 +123,15 @@ Once you are happy with the test, you can publish the Zap. When it is turned on,
## Searches
-### Fetch Dataset Items
+### Fetch dataset items
> Retrieves items from a [dataset](/platform/storage/dataset).
-### Find Last Actor Run
+### Find last Actor run
> Finds the most recent Actor run.
-### Find Last Task Run
+### Find last task run
> Finds the most recent Actor task run.
diff --git a/sources/platform/proxy/datacenter_proxy.md b/sources/platform/proxy/datacenter_proxy.md
index 1cf81cb789..9ed2300b93 100644
--- a/sources/platform/proxy/datacenter_proxy.md
+++ b/sources/platform/proxy/datacenter_proxy.md
@@ -47,7 +47,7 @@ This feature is also useful if you have your own pool of proxy servers and still
Prices for dedicated proxy servers are mainly based on the number of proxy servers, their type, and location. [Contact us](https://apify.com/contact) for more information.
-## Connecting to datacenter proxies
+## Connect to datacenter proxies
By default, each proxied HTTP request is potentially sent via a different target proxy server, which adds overhead and can be problematic for websites that save cookies based on IP address.
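To keep a series of requests on the same IP address, Apify Proxy lets you encode a session (and optionally proxy groups) in the proxy username. A minimal sketch, assuming the standard `proxy.apify.com:8000` endpoint and the `groups-`/`session-` username syntax described in the proxy docs (the password placeholder is yours to substitute):

```python
def apify_proxy_url(password, groups=None, session=None,
                    host="proxy.apify.com", port=8000):
    """Build an Apify Proxy URL; reusing a session name keeps requests
    made with it on the same underlying proxy server (sketch only)."""
    parts = []
    if groups:
        parts.append("groups-" + "+".join(groups))  # e.g. groups-SHADER
    if session:
        parts.append(f"session-{session}")
    username = ",".join(parts) or "auto"  # "auto" picks proxies automatically
    return f"http://{username}:{password}@{host}:{port}"

# Reuse the same session name to keep the same IP across requests:
print(apify_proxy_url("<PROXY_PASSWORD>", session="my_session_1"))
```

Pass the resulting URL to your HTTP client's proxy setting; a new session name rotates to a different server.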
diff --git a/sources/platform/proxy/google_serp_proxy.md b/sources/platform/proxy/google_serp_proxy.md
index ddc39a0f7b..438bc06ba4 100644
--- a/sources/platform/proxy/google_serp_proxy.md
+++ b/sources/platform/proxy/google_serp_proxy.md
@@ -24,7 +24,7 @@ Our Google SERP proxy currently supports the below services.
When using the proxy, **pricing is based on the number of requests made**.
-## Connecting to Google SERP proxy
+## Connect to Google SERP proxy
Requests made through the proxy are automatically routed through a proxy server from the selected country and pure **HTML code of the search result page is returned**.
@@ -61,7 +61,7 @@ See a [full list](https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXW
## Examples
-### Using the Apify SDK
+### Use the Apify SDK
If you are developing your own Apify [Actor](../actors/index.mdx) using the [Apify SDK](/sdk) and [Crawlee](https://crawlee.dev/), the most efficient way to use Google SERP proxy is [CheerioCrawler](https://crawlee.dev/api/cheerio-crawler/class/CheerioCrawler). This is because Google SERP proxy [only returns a page's HTML](./index.md). Alternatively, you can use the [got-scraping](https://github.com/apify/got-scraping) [npm package](https://www.npmjs.com/package/got-scraping) by specifying the proxy URL in the options. For Python, you can leverage the [`requests`](https://pypi.org/project/requests/) library along with the Apify SDK.
@@ -145,7 +145,7 @@ await Actor.exit();
-### Using standard libraries and languages
+### Use standard libraries and languages
You can find your proxy password on the [Proxy page](https://console.apify.com/proxy/access) of Apify Console.
diff --git a/sources/platform/proxy/residential_proxy.md b/sources/platform/proxy/residential_proxy.md
index f65470e661..77fac37bdc 100644
--- a/sources/platform/proxy/residential_proxy.md
+++ b/sources/platform/proxy/residential_proxy.md
@@ -20,7 +20,7 @@ Residential proxies support [IP address rotation](./usage.md#ip-address-rotation
**Pricing is based on data traffic**. It is measured for each connection made and displayed on your [proxy usage dashboard](https://console.apify.com/proxy/usage) in the Apify Console.
-## Connecting to residential proxy
+## Connect to residential proxy
Connecting to residential proxy works the same way as [datacenter proxy](./datacenter_proxy.md), with two differences.
diff --git a/sources/platform/proxy/usage.md b/sources/platform/proxy/usage.md
index 0203aeb2f4..86d5a83898 100644
--- a/sources/platform/proxy/usage.md
+++ b/sources/platform/proxy/usage.md
@@ -19,24 +19,24 @@ The full connection string has the following format:
http://<username>:<password>@<hostname>:<port>
```
-:::caution
+:::caution Password security
All usage of Apify Proxy with your password is charged towards your account. Do not share the password with untrusted parties or use it from insecure networks, as **the password is sent unencrypted** due to the HTTP protocol's [limitations](https://www.guru99.com/difference-http-vs-https.html).
:::
### External connection
-If you want to connect to Apify Proxy from outside of the Apify Platform, you need to have a paid Apify plan (to prevent abuse).
+If you want to connect to Apify Proxy from outside of the Apify platform, you need to have a paid Apify plan (to prevent abuse).
If you need to test Apify Proxy before you subscribe, please [contact our support](https://apify.com/contact).
-| Parameter | Value / explanation |
-|---------------------|---------------------|
-| Hostname | `proxy.apify.com`|
-| Port | `8000` |
-| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details. <br/>**Note**: this is not your Apify username.|
-| Password | Apify Proxy password. Your password is displayed on the [Proxy](https://console.apify.com/proxy/groups) page in Apify Console. <br/>**Note**: this is not your Apify account password. |
+| Parameter | Value / explanation |
+| :--- | :--- |
+| Hostname | `proxy.apify.com` |
+| Port | `8000` |
+| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details. <br/>**Note**: this is not your Apify username. |
+| Password | Apify Proxy password. Your password is displayed on the [Proxy](https://console.apify.com/proxy/groups) page in Apify Console. <br/>**Note**: this is not your Apify account password. |
-:::caution
-If you use these connection parameters for connecting to Apify Proxy from your Actors running on the Apify Platform, the connection will still be considered external, it will not work on the Free plan, and on paid plans you will be charged for external data transfer. Please use the connection parameters from the [Connection from Actors](#connection-from-actors) section when using Apify Proxy from Actors.
+:::caution External connections
+If you use these connection parameters to connect to Apify Proxy from your Actors running on the Apify platform, the connection will still be considered external: it will not work on the Free plan, and on paid plans you will be charged for external data transfer. Please use the connection parameters from the [Connection from Actors](#connection-from-actors) section when using Apify Proxy from Actors.
:::
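Putting the table together, a connection string for an external connection can be assembled as follows. This is a minimal Python sketch; the `auto` username and the password shown are placeholders for your own values from the Proxy page:

```python
def build_proxy_url(username: str, password: str,
                    hostname: str = "proxy.apify.com", port: int = 8000) -> str:
    """Assemble an Apify Proxy connection string from the parameters above."""
    return f"http://{username}:{password}@{hostname}:{port}"

# With the default `auto` username and a placeholder password:
print(build_proxy_url("auto", "<YOUR_PROXY_PASSWORD>"))
# http://auto:<YOUR_PROXY_PASSWORD>@proxy.apify.com:8000
```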
Example connection string for external connections:
@@ -47,17 +47,17 @@ http://auto:apify_proxy_EaAFg6CFhc4eKk54Q1HbGDEiUTrk480uZv03@proxy.apify.com:800
### Connection from Actors
-If you want to connect to Apify Proxy from Actors running on the Apify Platform, the recommended way is to use built-in proxy configuration tools in the [Apify SDK JavaScript](/sdk/js/docs/guides/proxy-management) or [Apify SDK Python](/sdk/python/docs/concepts/proxy-management)
+If you want to connect to Apify Proxy from Actors running on the Apify platform, the recommended way is to use the built-in proxy configuration tools in the [Apify SDK JavaScript](/sdk/js/docs/guides/proxy-management) or [Apify SDK Python](/sdk/python/docs/concepts/proxy-management).
If you don't want to use these helpers, and want to connect to Apify Proxy manually, you can find the right configuration values in [environment variables](../actors/development/programming_interface/environment_variables.md) provided to the Actor.
By using this configuration, you ensure that you connect to Apify Proxy directly through the Apify infrastructure, bypassing any external connection via the Internet, thereby improving the connection speed, and ensuring you don't pay for external data transfer.
-| Parameter | Source / explanation |
-|---------------------|---------------------|
-| Hostname | `APIFY_PROXY_HOSTNAME` environment variable |
-| Port | `APIFY_PROXY_PORT` environment variable |
-| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details. <br/>**Note**: this is not your Apify username.|
-| Password | `APIFY_PROXY_PASSWORD` environment variable |
+| Parameter | Source / explanation |
+| :--- | :--- |
+| Hostname | `APIFY_PROXY_HOSTNAME` environment variable |
+| Port | `APIFY_PROXY_PORT` environment variable |
+| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details. <br/>**Note**: this is not your Apify username. |
+| Password | `APIFY_PROXY_PASSWORD` environment variable |
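If you assemble the URL manually instead of using the SDK helpers, a sketch reading the variables from the table might look like this (it assumes the `APIFY_PROXY_*` environment variables are set, as they are inside an Actor run):

```python
import os

def proxy_url_from_env(username: str = "auto") -> str:
    """Build the proxy URL from the environment variables listed above."""
    hostname = os.environ["APIFY_PROXY_HOSTNAME"]
    port = os.environ["APIFY_PROXY_PORT"]
    password = os.environ["APIFY_PROXY_PASSWORD"]
    return f"http://{username}:{password}@{hostname}:{port}"
```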
Example connection string creation:
@@ -138,8 +138,8 @@ Web scrapers can rotate the IP addresses they use to access websites. They assig
Depending on whether you use a [browser](https://apify.com/apify/web-scraper) or [HTTP requests](https://apify.com/apify/cheerio-scraper) for your scraping jobs, IP address rotation works differently.
-* Browser—a different IP address is used for each browser.
-* HTTP request—a different IP address is used for each request.
+* Browser - a different IP address is used for each browser.
+* HTTP request - a different IP address is used for each request.
Use [sessions](#sessions) to control how you rotate IP addresses. See our guide [Anti-scraping techniques](/academy/anti-scraping/techniques) to learn more about IP address rotation and our findings on how blocking works.
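A small helper for composing the proxy username with a session can sketch this rotation control. It assumes the comma-separated `key-value` username format described in the username parameters section (e.g. `groups-RESIDENTIAL,session-my_session`); the group and session names below are illustrative:

```python
def proxy_username(groups=None, session=None, country=None) -> str:
    """Compose the proxy username from optional parameters; 'auto' selects defaults."""
    parts = []
    if groups:
        parts.append("groups-" + "+".join(groups))
    if session:
        parts.append(f"session-{session}")
    if country:
        parts.append(f"country-{country}")
    return ",".join(parts) if parts else "auto"
```

Reusing the same `session` value keeps the same IP address across requests, while changing it rotates to a different one.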
@@ -169,7 +169,7 @@ If you need to allow communication to `apify.proxy.com`, add the following IP ad
To view your connection status to [Apify Proxy](https://apify.com/proxy), open [http://proxy.apify.com/](http://proxy.apify.com/) in a browser using the proxy. If the proxy connection is working, the page should look something like this:
-
+
To test that your requests are proxied and IP addresses are being [rotated](/academy/anti-scraping/techniques) correctly, open the following API endpoint via the proxy. It shows information about the client IP address.
diff --git a/sources/platform/schedules.md b/sources/platform/schedules.md
index ef13cc0048..be9b54e272 100644
--- a/sources/platform/schedules.md
+++ b/sources/platform/schedules.md
@@ -32,7 +32,7 @@ However, runs can be delayed because of a system overload or a server shutting d
Each schedule can be associated with a maximum of _10_ Actors and _10_ Actor tasks.
-## Setting up a new schedule
+## Set up a new schedule
Before setting up a new schedule, you should have the [Actor](./actors/index.mdx) or [task](./actors/running/tasks.md) you want to schedule prepared and tested.
diff --git a/sources/platform/security.md b/sources/platform/security.md
index ec13e53011..52b682ad20 100644
--- a/sources/platform/security.md
+++ b/sources/platform/security.md
@@ -109,6 +109,6 @@ Please adhere strictly to the following rules. Failure to do so may result in le
:::
-## Securing your data
+## Secure your data
The Apify platform provides you with multiple ways to secure your data, including [encrypted environment variables](./actors/development/programming_interface/environment_variables.md) for storing your configuration secrets and [encrypted input](./actors/development/actor_definition/input_schema/secret_input.md) for securing the input parameters of your Actors.
diff --git a/sources/platform/storage/dataset.md b/sources/platform/storage/dataset.md
index b56d6008d9..5bebec9207 100644
--- a/sources/platform/storage/dataset.md
+++ b/sources/platform/storage/dataset.md
@@ -378,7 +378,7 @@ This feature is also useful when customizing your RSS feeds generated for variou
By default, the whole result is wrapped in an `<items>` element, while each page object is contained in an `<item>` element. You can change this using the `xmlRoot` and `xmlRow` URL parameters when retrieving your data with a GET request.
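For instance, a GET request URL overriding both parameters could be built like this (a sketch; the dataset ID and element names are placeholders):

```python
from urllib.parse import urlencode

def dataset_items_url(dataset_id: str, **params) -> str:
    """Build a dataset items endpoint URL with optional query parameters."""
    base = f"https://api.apify.com/v2/datasets/{dataset_id}/items"
    return f"{base}?{urlencode(params)}" if params else base

# e.g. dataset_items_url("abc", format="xml", xmlRoot="results", xmlRow="page")
```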
-## Sharing
+## Share
You can grant [access rights](../collaboration/index.md) to your dataset through the **Share** button under the **Actions** menu. For more details, check the [full list of permissions](../collaboration/list_of_permissions.md).
@@ -386,7 +386,7 @@ You can also share datasets by link using their ID or name, depending on your ac
For one-off sharing of specific records when access is restricted, you can generate time-limited pre-signed URLs. See [Sharing restricted resources with pre-signed URLs](/platform/collaboration/general-resource-access#pre-signed-urls).
-### Sharing datasets between runs
+### Share datasets between runs
You can access a dataset from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run as long as you know its _name_ or _ID_.
diff --git a/sources/platform/storage/key_value_store.md b/sources/platform/storage/key_value_store.md
index 8d4bb85d6c..57c790b4e2 100644
--- a/sources/platform/storage/key_value_store.md
+++ b/sources/platform/storage/key_value_store.md
@@ -265,7 +265,7 @@ You can compress a record and use the [Content-Encoding request header](https://
_Using the [JavaScript SDK](/sdk/js/reference/class/KeyValueStore#setValue) or our [JavaScript API client](/api/client/js/reference/class/KeyValueStoreClient#setRecord) automatically compresses your files._ We advise utilizing the JavaScript API client for data compression prior to server upload and decompression upon retrieval, minimizing storage costs.
-## Sharing
+## Share
You can grant [access rights](../collaboration/index.md) to your key-value store through the **Share** button under the **Actions** menu. For more details, check the [full list of permissions](../collaboration/list_of_permissions.md).
@@ -273,7 +273,7 @@ You can also share key-value stores by link using their ID or name, depending on
For one-off sharing of specific records when access is restricted, you can generate time-limited pre-signed URLs. See [Sharing restricted resources with pre-signed URLs](/platform/collaboration/general-resource-access#pre-signed-urls).
-### Sharing key-value stores between runs
+### Share key-value stores between runs
You can access a key-value store from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run as long as you know its _name_ or _ID_.
diff --git a/sources/platform/storage/request_queue.md b/sources/platform/storage/request_queue.md
index 32c0fa87b4..da7baecdcc 100644
--- a/sources/platform/storage/request_queue.md
+++ b/sources/platform/storage/request_queue.md
@@ -409,7 +409,7 @@ If the Actor processing the request fails, the lock expires, and the request is
In the following example, we demonstrate how you can use locking mechanisms to avoid concurrent processing of the same request across multiple Actor runs.
-:::info
+:::info Lock mechanism
The lock mechanism works on the client level, as well as the run level, when running the Actor on the Apify platform.
This means you can unlock or prolong the lock on a locked request only if:
@@ -554,7 +554,7 @@ await Actor.exit();
A detailed tutorial on how to process one request queue with multiple Actor runs can be found in [Academy tutorials](https://docs.apify.com/academy/node-js/multiple-runs-scrape).
-## Sharing
+## Share
You can grant [access rights](../collaboration/index.md) to your request queue through the **Share** button under the **Actions** menu. For more details, check the [full list of permissions](../collaboration/list_of_permissions.md).
@@ -562,7 +562,7 @@ You can also share request queues by link using their ID or name, depending on y
For one-off sharing of specific records when access is restricted, you can generate time-limited pre-signed URLs. See [Sharing restricted resources with pre-signed URLs](/platform/collaboration/general-resource-access#pre-signed-urls).
-### Sharing request queues between runs
+### Share request queues between runs
You can access a request queue from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run as long as you know its _name_ or _ID_.
diff --git a/sources/platform/storage/usage.md b/sources/platform/storage/usage.md
index ac55550620..980399cfa3 100644
--- a/sources/platform/storage/usage.md
+++ b/sources/platform/storage/usage.md
@@ -55,7 +55,7 @@ Additionally, you can quickly share the contents and details of your storage by

-These URLs link to API _endpoints_—the places where your data is stored. Endpoints that allow you to _read_ stored information do not require an [authentication token](/api/v2#authentication). Calls are authenticated using a hard-to-guess ID, allowing for secure sharing. However, operations such as _update_ or _delete_ require the authentication token.
+These URLs link to API _endpoints_ - the places where your data is stored. Endpoints that allow you to _read_ stored information do not require an [authentication token](/api/v2#authentication). Calls are authenticated using a hard-to-guess ID, allowing for secure sharing. However, operations such as _update_ or _delete_ require the authentication token.
> Never share a URL containing your authentication token, to avoid compromising your account's security.
@@ -77,11 +77,11 @@ With other request types and when using the `username~store-name`, however, you
For further details and a breakdown of each storage API endpoint, refer to the [API documentation](/api/v2/storage-datasets).
-### Apify API Clients
+### Apify API clients
-The Apify API Clients allow you to access your datasets from any Node.js or Python application, whether it's running on the Apify platform or externally.
+The Apify API clients allow you to access your datasets from any Node.js or Python application, whether it's running on the Apify platform or externally.
-You can visit [API Clients](/api) documentations for more information.
+You can visit the [API clients](/api) documentation for more information.
### Apify SDKs
@@ -133,7 +133,7 @@ Go to the [API documentation](/api/v2#rate-limiting) for details and to learn wh
Apify securely stores your ten most recent runs indefinitely, ensuring your records are always accessible. Unnamed datasets and runs beyond the latest ten will be automatically deleted after 7 days unless otherwise specified. Named datasets are retained indefinitely.
-### Preserving your storages
+### Preserve your storages
To ensure indefinite retention of your storages, assign them a name. This can be done via Apify Console or through our API. First, you'll need your store's ID. You can find it in the details of the run that created it. In Apify Console, head over to your run's details and select the **Dataset**, **Key-value store**, or **Request queue** tab as appropriate. Check that store's details, and you will find its ID among them.
@@ -158,7 +158,7 @@ Named and unnamed storages are identical in all aspects except for their retenti
For example, storage names `janedoe~my-storage-1` and `janedoe~web-scrape-results` are easier to tell apart than the alphanumerical IDs `cAbcYOfuXemTPwnIB` and `CAbcsuZbp7JHzkw1B`.
-## Sharing
+## Share
You can grant [access rights](../collaboration/index.md) to other Apify users to view or modify your storages. Check the [full list of permissions](../collaboration/list_of_permissions.md).
@@ -172,7 +172,7 @@ If your storage resource is set to _restricted_, all API calls must include a va
:::
-### Sharing storages between runs
+### Share storages between runs
Storage can be accessed from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run, provided you have its _name_ or _ID_. You can access and manage storages from other runs using the same methods or endpoints as with storages from your current run.
@@ -190,7 +190,7 @@ Learn how restricted access works in [General resource access](/platform/collabo
:::
-## Deleting storages
+## Delete storages
Named storages are only removed upon your request.
You can delete storages in the following ways: