From cf50da9c0975665317578ce2abb17b903cbfe854 Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 08:21:21 +0000 Subject: [PATCH 1/9] docs: fix style guide violations in platform documentation MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Fix 324 style guide violations across 66 platform documentation files: - Replace em dashes (—) with spaced hyphens ( - ) for consistency - Add descriptive titles to all admonitions (:::note, :::tip, :::caution, etc.) - Convert gerund headings (-ing forms) to imperative or noun forms - Convert Title Case headings to sentence case These changes ensure full compliance with Apify documentation style guide standards for readability, consistency, and accessibility. Co-Authored-By: Claude Sonnet 4.5 --- .../development/actor_definition/docker.md | 6 +- .../dynamic_actor_memory/index.md | 2 +- .../input_schema/custom_error_messages.md | 2 +- .../input_schema/specification.md | 4 +- .../actor_definition/output_schema/index.md | 2 +- .../development/builds_and_runs/index.md | 4 +- .../builds_and_runs/state_persistence.md | 4 +- .../deployment/continuous_integration.md | 4 +- .../actors/development/permissions/index.md | 4 +- .../development/quick-start/start_locally.md | 2 +- sources/platform/actors/index.mdx | 4 +- sources/platform/actors/publishing/index.mdx | 2 +- .../publishing/monetize/pricing_and_costs.mdx | 2 +- .../platform/actors/running/actor_standby.md | 8 +-- sources/platform/actors/running/index.md | 4 +- .../actors/running/input_and_output.md | 4 +- .../actors/running/runs_and_builds.md | 2 +- sources/platform/actors/running/store.md | 68 +++++++++---------- .../collaboration/general-resource-access.md | 24 +++---- .../organization_account/how_to_use.md | 2 +- sources/platform/console/index.md | 2 +- .../console/two-factor-authentication.md | 10 +-- sources/platform/index.mdx | 2 +- sources/platform/integrations/actors/index.md | 2 +-
.../actors/integration_ready_actors.md | 2 +- .../platform/integrations/ai/aws_bedrock.md | 2 +- sources/platform/integrations/ai/chatgpt.md | 2 +- sources/platform/integrations/ai/crewai.md | 4 +- sources/platform/integrations/ai/flowise.md | 2 +- sources/platform/integrations/ai/langflow.md | 6 +- sources/platform/integrations/ai/langgraph.md | 2 +- sources/platform/integrations/ai/lindy.md | 2 +- sources/platform/integrations/ai/mastra.md | 4 +- .../platform/integrations/ai/openai_agents.md | 4 +- sources/platform/integrations/ai/skyfire.md | 6 +- .../platform/integrations/ai/vercel-ai-sdk.md | 2 +- .../data-storage/airtable/index.md | 4 +- .../integrations/data-storage/keboola.md | 6 +- .../integrations/integrate_with_apify.md | 8 +-- .../platform/integrations/programming/api.md | 29 ++++---- .../workflows-and-notifications/bubble.md | 14 ++-- .../gumloop/index.md | 2 +- .../gumloop/tiktok.md | 4 +- .../workflows-and-notifications/kestra.md | 6 +- .../make/ai-crawling.md | 6 +- .../make/amazon.md | 2 +- .../workflows-and-notifications/make/index.md | 12 ++-- .../make/instagram.md | 4 +- .../workflows-and-notifications/make/maps.md | 14 ++-- .../workflows-and-notifications/n8n/index.md | 4 +- .../n8n/website-content-crawler.md | 2 +- .../workflows-and-notifications/slack.md | 4 +- .../workflows-and-notifications/telegram.md | 2 +- .../workflows-and-notifications/windmill.md | 14 ++-- .../workflows-and-notifications/workato.md | 10 +-- .../workflows-and-notifications/zapier.md | 10 +-- sources/platform/proxy/datacenter_proxy.md | 2 +- sources/platform/proxy/google_serp_proxy.md | 6 +- sources/platform/proxy/residential_proxy.md | 2 +- sources/platform/proxy/usage.md | 10 +-- sources/platform/schedules.md | 2 +- sources/platform/security.md | 4 +- sources/platform/storage/dataset.md | 4 +- sources/platform/storage/key_value_store.md | 4 +- sources/platform/storage/request_queue.md | 6 +- sources/platform/storage/usage.md | 16 ++--- 66 files changed, 211 
insertions(+), 210 deletions(-) diff --git a/sources/platform/actors/development/actor_definition/docker.md b/sources/platform/actors/development/actor_definition/docker.md index 9b604a88ac..a22718dd32 100644 --- a/sources/platform/actors/development/actor_definition/docker.md +++ b/sources/platform/actors/development/actor_definition/docker.md @@ -126,7 +126,7 @@ When the Playwright/Puppeteer version in your `package.json` differs from what's ::: -### Using `*` as version (alternative approach) +### Use `*` as version (alternative approach) You may encounter older documentation or templates using `*` as the Playwright/Puppeteer version: @@ -208,7 +208,7 @@ You can check out various optimization tips for Dockerfile in our [Performance]( ::: -## Updating older Dockerfiles +## Update older Dockerfiles All Apify base Docker images now use a non-root user to enhance security. This change requires updates to existing Actor `Dockerfile`s that use the `apify/actor-node`, `apify/actor-python`, `apify/actor-python-playwright`, or `apify/actor-python-selenium` images. This section provides guidance on resolving common issues that may arise during this migration. @@ -293,7 +293,7 @@ You should remove these lines, as the new user is now `myuser`. Don't forget to COPY --chown=myuser:myuser . ./ ``` -#### Installing dependencies that require root access +#### Install dependencies that require root access The `root` user is still available in the Docker images. If you must run steps that require root access (like installing system packages with `apt` or `apk`), you can temporarily switch to the `root` user. 
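A minimal sketch of that pattern, assuming an Alpine-based Node.js base image (the image tag and the `curl` package are placeholders; the non-root user is `myuser`, as described above):

```dockerfile
FROM apify/actor-node:20

# Temporarily switch to root to install a system package.
USER root
RUN apk add --no-cache curl

# Switch back to the non-root user before copying in the source code,
# so the Actor does not run as root.
USER myuser
COPY --chown=myuser:myuser . ./
```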
diff --git a/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md b/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md index 32e863746a..c7450d2536 100644 --- a/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md +++ b/sources/platform/actors/development/actor_definition/dynamic_actor_memory/index.md @@ -218,7 +218,7 @@ If the calculation results in an error, the Actor will start with a fixed defaul -### Testing expressions +### Test expressions #### Use npm package diff --git a/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md b/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md index cea9169224..9fd8a2becc 100644 --- a/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md +++ b/sources/platform/actors/development/actor_definition/input_schema/custom_error_messages.md @@ -102,4 +102,4 @@ It's possible to define custom error messages in sub-properties as well. For obj ## Best practices -Custom error messages can be useful in specific cases, but they aren't always necessary. In most situations, the default validation messages are clear enough and ensure consistency across the platform. Use custom messages only when they meaningfully improve clarity—for example, when the default message would expose an unreadable regular expression or fail to explain a non-obvious requirement. +Custom error messages can be useful in specific cases, but they aren't always necessary. In most situations, the default validation messages are clear enough and ensure consistency across the platform. Use custom messages only when they meaningfully improve clarity - for example, when the default message would expose an unreadable regular expression or fail to explain a non-obvious requirement. 
diff --git a/sources/platform/actors/development/actor_definition/input_schema/specification.md b/sources/platform/actors/development/actor_definition/input_schema/specification.md index 9d8d43153e..122a723548 100644 --- a/sources/platform/actors/development/actor_definition/input_schema/specification.md +++ b/sources/platform/actors/development/actor_definition/input_schema/specification.md @@ -22,7 +22,7 @@ The Actor input schema file is used to: To define an input schema for an Actor, set `input` field in the `.actor/actor.json` file to an input schema object (described below), or path to a JSON file containing the input schema object. -For backwards compatibility, if the `input` field is omitted, the system looks for an `INPUT_SCHEMA.json` file either in the `.actor` directory or the Actor's top-level directory—but note that this functionality is deprecated and might be removed in the future. The maximum allowed size for the input schema file is 500 kB. +For backwards compatibility, if the `input` field is omitted, the system looks for an `INPUT_SCHEMA.json` file either in the `.actor` directory or the Actor's top-level directory - but note that this functionality is deprecated and might be removed in the future. The maximum allowed size for the input schema file is 500 kB. When you provide an input schema, the Apify platform will validate the input data passed to the Actor on start (via the API or Apify Console) to ensure compliance before starting the Actor. If the input object doesn't conform the schema, the caller receives an error and the Actor is not started. @@ -372,7 +372,7 @@ Properties: | Property | Value | Required | Description | |------------|-----------------------------------------------------|----------|-------------------------------------------------------------------------------| -| `type` | One of | Yes | Defines the type of the field — either an integer or a floating-point number. 
| +| `type` | One of | Yes | Defines the type of the field - either an integer or a floating-point number. | | `editor` | One of: | No | Visual editor used for input field. | | `maximum` | Integer or Number
(based on the `type`) | No | Maximum allowed value. | | `minimum` | Integer or Number
(based on the `type`) | No | Minimum allowed value. | diff --git a/sources/platform/actors/development/actor_definition/output_schema/index.md b/sources/platform/actors/development/actor_definition/output_schema/index.md index d4709248db..95a6545926 100644 --- a/sources/platform/actors/development/actor_definition/output_schema/index.md +++ b/sources/platform/actors/development/actor_definition/output_schema/index.md @@ -280,7 +280,7 @@ When a user runs the Actor in the Console, the UI will look like this: ![Video files in Output tab](images/output-schema-combination-example.png) -### Using container URL to display chat client +### Use container URL to display chat client In this example, an Actor runs a web server that provides a chat interface to an LLM. The conversation history is then stored in the dataset. diff --git a/sources/platform/actors/development/builds_and_runs/index.md b/sources/platform/actors/development/builds_and_runs/index.md index 5c3c0c71fb..590f2170a8 100644 --- a/sources/platform/actors/development/builds_and_runs/index.md +++ b/sources/platform/actors/development/builds_and_runs/index.md @@ -11,7 +11,7 @@ slug: /actors/development/builds-and-runs Actor **builds** and **runs** are fundamental concepts within the Apify platform. Understanding them is crucial for effective use of the platform. -## Building an Actor +## Build an Actor When you start the build process for your Actor, you create a _build_. 
A build is a Docker image containing your source code and the required dependencies needed to run the Actor: @@ -27,7 +27,7 @@ flowchart LR AD -- "build process" --> Build ``` -## Running an Actor +## Run an Actor To create a _run_, you take your _build_ and start it with some input: diff --git a/sources/platform/actors/development/builds_and_runs/state_persistence.md b/sources/platform/actors/development/builds_and_runs/state_persistence.md index 218b88f387..2c2c48972c 100644 --- a/sources/platform/actors/development/builds_and_runs/state_persistence.md +++ b/sources/platform/actors/development/builds_and_runs/state_persistence.md @@ -21,7 +21,7 @@ To prevent data loss, long-running Actors should: For short-running Actors, the risk of restarts and the cost of repeated runs are low, so you can typically ignore state persistence. -## Understanding migrations +## Understand migrations A migration occurs when a process running on one server must stop and move to another. During this process: @@ -45,7 +45,7 @@ Migrations don't follow a specific schedule. They can occur at any time due to t By default, an Actor keeps its state in the server's memory. During a server switch, the run loses access to the previous server's memory. Even if data were saved on the server's disk, access to that would also be lost. Note that the Actor run's default dataset, key-value store and request queue are preserved across migrations, by state we mean the contents of runtime variables in the Actor's code. -## Implementing state persistence +## Implement state persistence The [Apify SDKs](/sdk) handle state persistence automatically. 
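The checkpoint-and-resume idea behind that automatic handling can be shown with a plain-Python sketch - a dict stands in for the Actor's key-value store, and all names are made up for illustration:

```python
# Checkpoint/resume sketch. In a real Actor, the Apify SDK persists the
# state object for you; here a plain dict stands in for the key-value store.

def process_urls(urls, store):
    # Reload the last checkpoint, or start fresh on the first run.
    state = store.get("STATE", {"next_index": 0, "results": []})
    for i in range(state["next_index"], len(urls)):
        state["results"].append(f"processed {urls[i]}")
        state["next_index"] = i + 1
        store["STATE"] = state  # checkpoint after every item
    return state["results"]

store = {}
first = process_urls(["a", "b", "c"], store)
# After a migration, the restarted run reloads STATE and skips finished work.
resumed = process_urls(["a", "b", "c"], store)
```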
diff --git a/sources/platform/actors/development/deployment/continuous_integration.md b/sources/platform/actors/development/deployment/continuous_integration.md index 9f7d0e3bc7..e951180c79 100644 --- a/sources/platform/actors/development/deployment/continuous_integration.md +++ b/sources/platform/actors/development/deployment/continuous_integration.md @@ -32,7 +32,7 @@ Set up continuous integration for your Actors using one of these methods: Choose the method that best fits your workflow. -## Option 1: Trigger builds with a Webhook +## Option 1: trigger builds with a webhook 1. Push your Actor to a GitHub repository. 1. Go to your Actor's detail page in Apify Console, click on the API tab in the top right, then select API Endpoints. Copy the **Build Actor** API endpoint URL. The format is as follows: @@ -54,7 +54,7 @@ Choose the method that best fits your workflow. Now your Actor will automatically rebuild on every push to the GitHub repository. -## Option 2: Set up automated builds and tests with GitHub Actions +## Option 2: set up automated builds and tests with GitHub Actions 1. Push your Actor to a GitHub repository. 1. Get your Apify API token from the [Apify Console](https://console.apify.com/settings/integrations) diff --git a/sources/platform/actors/development/permissions/index.md b/sources/platform/actors/development/permissions/index.md index 181990ce0b..2329a3ed74 100644 --- a/sources/platform/actors/development/permissions/index.md +++ b/sources/platform/actors/development/permissions/index.md @@ -45,7 +45,7 @@ To learn how to migrate your Actors to run under limited permissions, check out ::: -### Configuring Actor permissions level +### Configure Actor permissions level You can set the permission level for your Actor in the Apify Console under its **Settings** tab. New Actors are configured to use limited permissions by default. Older Actors might still use full permissions until you update their configuration.
@@ -66,7 +66,7 @@ When possible, design your Actors to use limited permissions and request only th ::: -### Accessing user provided storages +### Access user provided storages By default, limited-permissions Actors can't access user storages. However, they can access storages that users explicitly provide via the Actor input. To do so, use the input schema to add a storage picker and declare exactly which operations your Actor needs. diff --git a/sources/platform/actors/development/quick-start/start_locally.md b/sources/platform/actors/development/quick-start/start_locally.md index a107d14db2..406d39f064 100644 --- a/sources/platform/actors/development/quick-start/start_locally.md +++ b/sources/platform/actors/development/quick-start/start_locally.md @@ -143,7 +143,7 @@ Let's now deploy your Actor to the Apify platform, where you can run the Actor o apify push ``` -### Step 5: It's Time to Iterate! +### Step 5: it's time to iterate! Good job! 🎉 You're ready to develop your Actor. You can make changes to your Actor and implement your use case. diff --git a/sources/platform/actors/index.mdx b/sources/platform/actors/index.mdx index bfd6843002..25b1c4e938 100644 --- a/sources/platform/actors/index.mdx +++ b/sources/platform/actors/index.mdx @@ -6,7 +6,7 @@ category: platform slug: /actors --- -**Learn how to run, develop, and publish Apify Actors — serverless cloud programs for web data extraction and workflow automation.** +**Learn how to run, develop, and publish Apify Actors - serverless cloud programs for web data extraction and workflow automation.** import Card from "@site/src/components/Card"; import CardGrid from "@site/src/components/CardGrid"; @@ -61,7 +61,7 @@ Build Actors to automate tasks, scrape data, or create custom workflows. The Api Ready to start? Check out the [Actor development documentation](/platform/actors/development). 
-## Running Actors +## Run Actors You can run Actors manually in [Apify Console](https://console.apify.com/actors), using the [API](/api), [CLI](/cli), or [scheduler](../schedules.md). You can easily [integrate Actors](../integrations/index.mdx) with other apps, [share](../collaboration/access_rights.md) them with other people, [publish](./publishing/index.mdx) them in [Apify Store](https://apify.com/store), and even [monetize](./publishing/monetize/index.mdx). diff --git a/sources/platform/actors/publishing/index.mdx b/sources/platform/actors/publishing/index.mdx index b781ddec57..5afccca102 100644 --- a/sources/platform/actors/publishing/index.mdx +++ b/sources/platform/actors/publishing/index.mdx @@ -57,7 +57,7 @@ To ensure long-term quality and improve your chances of successfully monetizing If you decide to make your Actor's code publicly available on [GitHub](https://github.com), code quality becomes even more crucial, as your Actor may be the first experience some users have with Apify. -### Handling breaking changes +### Handle breaking changes While refactoring and updating your Actor's code is encouraged, be cautious of making changes that could break the Actor for existing users. If you plan to introduce breaking change, please contact us at [community@apify.com](mailto:community@apify.com) beforehand, and we'll assist you in communicating the change to your users. diff --git a/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx b/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx index d58140917c..91f42fe9b7 100644 --- a/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx +++ b/sources/platform/actors/publishing/monetize/pricing_and_costs.mdx @@ -64,7 +64,7 @@ While optional, we recommend offering progressively lower prices for higher disc Your platform costs are also lower for these higher tier, which helps maintain healthy profit margins. 
This is further detailed in the [Computing your costs for PPE and PPR Actors](#computing-your-costs-for-ppe-and-ppr-actors) section. -## Implementing discount tiers +## Implement discount tiers By default, we advise against setting excessively high prices for _FREE_ tier users, as this can limit the ability to evaluate your Actor thoroughly. However, in certain situations, such as protecting your Actor from fraudulent activity or excessive use of your internal APIs, a higher price for _FREE_ tier users might be justified. diff --git a/sources/platform/actors/running/actor_standby.md b/sources/platform/actors/running/actor_standby.md index b2c171b477..a7ebc58970 100644 --- a/sources/platform/actors/running/actor_standby.md +++ b/sources/platform/actors/running/actor_standby.md @@ -14,7 +14,7 @@ Traditional Actors are designed to run a single job and then stop. They're mostl However, in some applications, waiting for an Actor to start is not an option. Actor Standby mode solves this problem by letting you have the Actor ready in the background, waiting for the incoming HTTP requests. In a sense, the Actor behaves like a real-time web server or standard API server. -## How do I know if Standby mode is enabled +## How do I know if Standby mode is enabled You will know that the Actor is enabled for Standby mode if you see the **Standby** tab on the Actor's detail page. In the tab, you will find the hostname of the server, the description of the Actor's endpoints, @@ -25,7 +25,7 @@ hit the API endpoint and get results. ![Standby tab](./images/actor_standby/standby-tab.png) -## How do I pass input to Actors in Standby mode +## How do I pass input to Actors in Standby mode If you're using an Actor built by someone else, see its Information tab to find out how the input should be passed.
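For example, if an Actor's Information tab documented its input as query parameters - a common convention for Standby endpoints - building the request URL might look like this sketch (the hostname, path, and parameters are hypothetical):

```python
from urllib.parse import urlencode

# Hypothetical Standby hostname and input fields; the real values come from
# the Actor's Standby and Information tabs in Apify Console.
hostname = "me--my-actor.apify.actor"
params = {"query": "web scraping", "maxResults": 10}

url = f"https://{hostname}/search?{urlencode(params)}"
# A GET request to `url` is then served by the Actor waiting in Standby mode.
```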
@@ -75,7 +75,7 @@ For requests sent to an Actor in Standby mode, the maximum time allowed until re The rate limit for incoming requests to a Standby Actor is _2000 requests per second_ per user account. -## How do I customize Standby configuration +## How do I customize Standby configuration The Standby configuration currently consists of the following properties: @@ -99,6 +99,6 @@ However, running Actors in Standby mode might have unexpected costs, as the Acto No, even if you use the Actor-level hostname with the default configuration, the background Actor runs for your requests are not shared with other users. -## How can I develop Actors using Standby mode +## How can I develop Actors using Standby mode See the [Actor Standby development section](../development/programming_interface/actor_standby.md). diff --git a/sources/platform/actors/running/index.md b/sources/platform/actors/running/index.md index 88be596a39..d5b103f52c 100644 --- a/sources/platform/actors/running/index.md +++ b/sources/platform/actors/running/index.md @@ -54,7 +54,7 @@ And you can use the export button at the bottom left to export the data in multi And that's it! Now you can get back to the Actor's input, play with it, and try out more of the [Apify Actors](https://apify.com/store) or [build your own](./development). -## Running via Apify API +## Run via Apify API Actors can also be invoked using the Apify API by sending an HTTP POST request to the [Run Actor](/api/v2/#/reference/actors/run-collection/run-actor) endpoint, such as: @@ -66,7 +66,7 @@ An Actor's input and its content type can be passed as a payload of the POST req > To learn more about this, read the [Run an Actor or task and retrieve data via API](/academy/api/run-actor-and-retrieve-data-via-api) tutorial. -## Running programmatically +## Run programmatically Actors can also be invoked programmatically from your own applications or from other Actors.
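As a sketch of such a call, the following builds a Run Actor request with the Python standard library (the Actor ID, input, and token are placeholders; the path follows the public `POST /v2/acts/{actorId}/runs` endpoint):

```python
import json
from urllib.parse import urlencode

# Placeholders - replace with a real Actor ID ("username~actor-name") and token.
actor_id = "apify~hello-world"
token = "MY_APIFY_TOKEN"

url = f"https://api.apify.com/v2/acts/{actor_id}/runs?{urlencode({'token': token})}"
payload = json.dumps({"message": "Hello from the API"})
# POST `payload` to `url` with a Content-Type: application/json header
# to start a run; the response body describes the newly created run.
```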
diff --git a/sources/platform/actors/running/input_and_output.md b/sources/platform/actors/running/input_and_output.md index 05f5a441ac..920d6b4ab8 100644 --- a/sources/platform/actors/running/input_and_output.md +++ b/sources/platform/actors/running/input_and_output.md @@ -27,7 +27,7 @@ When running an Actor using the [API](https://docs.apify.com/api/v2) you can pas } ``` -### Options - Build, Timeout, and Memory +### Options - Build, timeout, and memory As part of the input, you can also specify run options such as [Build](../development/builds_and_runs/builds.md), Timeout, and [Memory](./usage_and_resources.md) for your Actor run. @@ -41,7 +41,7 @@ As part of the input, you can also specify run options such as [Build](../develo :::info Dynamic memory -If the Actor is configured by developer to use [dynamic memory](../development/actor_definition/dynamic_actor_memory/index.md), the system will calculate the optimal memory allocation based on your input. In this case, the **Memory** option acts as an override — if you set it, the calculated value will be ignored. +If the Actor is configured by developer to use [dynamic memory](../development/actor_definition/dynamic_actor_memory/index.md), the system will calculate the optimal memory allocation based on your input. In this case, the **Memory** option acts as an override - if you set it, the calculated value will be ignored. ::: diff --git a/sources/platform/actors/running/runs_and_builds.md b/sources/platform/actors/running/runs_and_builds.md index 9accc07dae..b2f33dfb70 100644 --- a/sources/platform/actors/running/runs_and_builds.md +++ b/sources/platform/actors/running/runs_and_builds.md @@ -127,6 +127,6 @@ Apify securely stores your ten most recent runs indefinitely, ensuring your reco **Actor builds** are deleted only when they are _not tagged_ and have not been used for over 90 days. 
-## Sharing +## Share Share your Actor runs with other Apify users via the [access rights](../../collaboration/index.md) system. diff --git a/sources/platform/actors/running/store.md b/sources/platform/actors/running/store.md index a79b426235..359f1a3ffd 100644 --- a/sources/platform/actors/running/store.md +++ b/sources/platform/actors/running/store.md @@ -28,9 +28,9 @@ All Actors in [Apify Store](https://apify.com/store) fall into one of the four p 3. [**Pay per event**](#pay-per-event) - you pay for specific events the Actor creator defines, such as generating a single result or starting the Actor. Most Actors include platform usage in the price, but some may charge it separately — check the Actor's pricing for details. 4. [**Pay per usage**](#pay-per-usage) - you can run the Actor and you pay for the platform usage the Actor generates. -### Rental Actors +### Rental Actors -Rental Actors are Actors for which you have to pay a recurring fee to the developer after your trial period ends. This empowers the developer to dedicate more time and effort to their Actors, thus ensuring they are of the _highest quality_ and receive _ongoing maintenance_. +Rental Actors are Actors for which you have to pay a recurring fee to the developer after your trial period ends. This empowers the developer to dedicate more time and effort to their Actors, thus ensuring they are of the _highest quality_ and receive _ongoing maintenance_. Most rental Actors have a _free trial_ period. The length of the trial is displayed on each Actor's page. @@ -39,28 +39,28 @@ Most rental Actors have a _free trial_ period. The length of the trial is displa After a trial period, a flat monthly _Actor rental_ fee is automatically subtracted from your prepaid platform usage in advance for the following month. Most of this fee goes directly to the developer and is paid on top of the platform usage generated by the Actor.
You can read more about our motivation for releasing rental Actors in [this blog post](https://blog.apify.com/make-regular-passive-income-developing-web-automation-actors-b0392278d085/) from Apify's CEO Jan Čurn. -#### Rental Actors - Frequently Asked Questions +#### Rental Actors - Frequently asked questions -##### Can I run rental Actors via API or the Apify client? +##### Can I run rental Actors via API or the Apify client? Yes, when you are renting an Actor, you can run it using either our [API](/api/v2), [JavaScript](/api/client/js) or [Python](/api/client/python) clients as you would do with private or free public Actors. -##### Do I pay platform costs for running rental Actors? +##### Do I pay platform costs for running rental Actors? [//]: # (TODO better link for platform usage costs explaining what it is!) Yes, you will pay normal [platform usage costs](https://apify.com/pricing) on top of the monthly Actor rental fee. The platform costs work exactly the same way as for free public Actors or your private Actors. You should find estimates of the cost of usage in each individual rental Actor's README ([see an example](https://apify.com/compass/crawler-google-places#how-much-will-it-cost)). -##### Do I need an Apify paid plan to use rental Actors? +##### Do I need an Apify paid plan to use rental Actors? You don't need a paid plan to start a rental Actor's free trial. Just activate the trial, and you are good to go. After that, you will need to subscribe to one of [Apify's paid plans](https://apify.com/pricing) in order to keep renting the Actor and continue using it. -##### When will I be charged for the Actor rental? +##### When will I be charged for the Actor rental? You always prepay the Actor rental for the following month. The first payment happens when the trial expires, and then recurs monthly. When you open the Actor in the Apify Console, you will see when the next rental payment is due, and you will also receive a notification when it happens.
_Example_: You activate a 7-day trial of an Actor at _noon of April 1, 2021_. If you don't turn off auto-renewal, you will be charged at _noon on April 8, 2021_, then _May 8, 2021_. -##### How am I charged for Actor rental? +##### How am I charged for Actor rental? The rental fee for an Actor is automatically subtracted from your prepaid platform usage, similarly to, e.g. [compute units](./usage_and_resources.md). If you don't have enough usage prepaid, you will need to cover any overage in the next invoice. @@ -68,17 +68,17 @@ The rental fee for an Actor is automatically subtracted from your prepaid platfo If you have an [Apify paid plan](https://apify.com/pricing), the monthly rental fee will be automatically subtracted from your plan's prepaid usage at the end of your free trial, and you will be able to run the Actor for another month. If you are not subscribed to any of [Apify's paid plans](https://apify.com/pricing), you will need to subscribe to one in order to continue using the Actor after the trial has ended. -##### Can I cancel my Actor rental? +##### Can I cancel my Actor rental? _You can cancel the Actor rental_ during your trial or any time after that so you don't get charged when your current Actor rental period expires. You can always turn it back on later if you want. -##### Where can I see how much I have paid for Actor rental? +##### Where can I see how much I have paid for Actor rental? Since Actor rental fees are paid from prepaid platform usage, these fees conceptually belong under platform usage. -You can find the breakdown of how much you have been charged for rental Actors in the **Actors** tab, which you will find within the **Current period** tab in the [Billing](https://console.apify.com/billing) section. +You can find the breakdown of how much you have been charged for rental Actors in the **Actors** tab, which you will find within the **Current period** tab in the [Billing](https://console.apify.com/billing) section.
-![Rental Actors billing in Apify Console](./images/store/billing-paid-actors.png) +![Rental Actors billing in Apify Console](./images/store/billing-paid-actors.png) ### Pay per result @@ -92,37 +92,37 @@ This makes it transparent and easy to estimate upfront costs. If you have any fe -#### Pay per result Actors - Frequently Asked Questions +#### Pay per result Actors - Frequently asked questions -##### How do I know an Actor is paid per result? +##### How do I know an Actor is paid per result? When you try the Actor on the platform, you will see that the Actor is paid per result next to the Actor name. ![Actor paid per result in Console](./images/store/console_pay_per_result_tag.png) -##### Do I need to pay a monthly rental fee to run the Actor? +##### Do I need to pay a monthly rental fee to run the Actor? No, the Actor is free to run. You only pay for the results. ##### What happens when I interact with the dataset after the run finishes? -Under the **pay per result** model, all platform costs generated _during the run of an Actor_ are not charged towards your account; you pay for the results instead. After the run finishes, any interactions with the default dataset storing the results, such as reading the results or writing additional data, will incur the standard platform usage costs. But do not worry, in the vast majority of cases, you only want to read the result from the dataset and that costs near to nothing. +Under the **pay per result** model, all platform costs generated _during the run of an Actor_ are not charged towards your account; you pay for the results instead. After the run finishes, any interactions with the default dataset storing the results, such as reading the results or writing additional data, will incur the standard platform usage costs. But do not worry - in the vast majority of cases, you only want to read the result from the dataset and that costs next to nothing. ##### Do I pay for the storage of results on the Apify platform?
You will still be charged for the timed storage of the data in the same fashion as with any other Actor. You can always decide to delete the dataset to reduce your costs after you export the data from the platform. By default, any unnamed dataset will be automatically removed after your data retention period, so usually, this is nothing to worry about. -##### Can I set a cap on how many results an Actor should return? +##### Can I set a cap on how many results an actor should return? You can set a limit on how many items an Actor should return and the amount you will be charged in Options on the Actor detail page in the section below the Actor input. ![Max items for pay-per-result](./images/store/max-items-for-pay-per-result.png) -##### Can I publish an Actor that is paid per result? +##### Can I publish an actor that is paid per result? Yes, you can publish an Actor that is paid per result. -##### Where do I see how much I was charged for the pay per result Actors? +##### Where do I see how much I was charged for the pay per result actors? You can see the overview of how much you have been charged for Actors paid by result on your invoices and in the [Usage tab](https://console.apify.com/billing) of the Billing section in Console. It will be shown there as a separate service. @@ -149,9 +149,9 @@ Most pay per event Actors include platform usage in the event price. However, so ::: -#### Pay per event Actors - Frequently Asked Questions +#### Pay per event actors - Frequently asked questions -#### How do I know Actor is paid per events? +#### How do I know an actor is paid per events? You will see that the Actor is paid per events next to the Actor name. @@ -159,41 +159,41 @@ You will see that the Actor is paid per events next to the Actor name. ![Example pay per event Actor](./images/store/pay_per_event_example_actor.png) -#### Do I need to pay a monthly rental fee to run the Actor? +#### Do I need to pay a monthly rental fee to run the actor?
No, you only pay for the events. #### What happens when I interact with the dataset after the run finishes? -You would still pay for all interactions after the Actor run finishes, same as for pay per result Actors. +You would still pay for all interactions after the Actor run finishes, same as for pay per result actors. #### Do I pay for the storage of results on the Apify platform? -You would still pay for the long term storage of results, same as for pay per result Actors. +You would still pay for the long-term storage of results, same as for pay per result actors. -#### Do I need to pay for platform usage with pay per event Actors? +#### Do I need to pay for platform usage with pay per event actors? -In most cases, no — the majority of pay per event Actors include [platform usage](./usage_and_resources.md) in the event price, so you only pay for the events. However, some Actors may charge platform usage separately, in addition to the event costs. Always check the pricing section on the Actor's page—it clearly states whether platform usage is included or not. +In most cases, no - the majority of pay per event actors include [platform usage](./usage_and_resources.md) in the event price, so you only pay for the events. However, some Actors may charge platform usage separately, in addition to the event costs. Always check the pricing section on the Actor's page - it clearly states whether platform usage is included or not. ![Pay per event with usage not included in Apify Store](./images/store/pay_per_event_and_usage_example_actor.png) -#### Where do I see how much I was charged for the pay per event Actors? +#### Where do I see how much I was charged for the pay per event actors? You can see how much you have been charged on your invoices, and on the [Usage tab](https://console.apify.com/billing) of the Billing section in the Console.
-![Pay per event Actor - historical usage tab](./images/store/pay_per_event_historical_usage_tab.png) +![Pay per event actor - historical usage tab](./images/store/pay_per_event_historical_usage_tab.png) You can also see the cost of each run on the run detail itself. -![Pay per event Actor - run detail](./images/store/pay_per_event_price_on_run_detail.png) +![Pay per event actor - run detail](./images/store/pay_per_event_price_on_run_detail.png) -#### Can I put a cap on a cost of a single Actor run? +#### Can I put a cap on the cost of a single actor run? Yes, when starting an Actor run, you can define the maximum limit on the cost of that run. When the Actor reaches the defined limit, it should terminate gracefully. Even if it didn't, for any reason, and kept producing results, we always make sure you are never charged more than your defined limit. -![Pay per event Actor - max charge per run](./images/store/pay_per_event_price_on_run_detail.png) +![Pay per event actor - max charge per run](./images/store/pay_per_event_price_on_run_detail.png) -#### How do I raise a dispute if the charges for an Actor seem off? +#### How do I raise a dispute if the charges for an actor seem off? In such a case, please do not hesitate to contact the Actor author or our support team. If you suspect a bug in the Actor, you can also always create an issue on the Actor detail in the Apify Console. @@ -211,13 +211,13 @@ _For more information on platform usage cost see the [usage and resources](./usa ::: -## Reporting issues with Actors +## Report issues with actors Each Actor has an **Issues** tab in Apify Console. There, you can open an issue (ticket) and chat with the Actor's author, platform admins, and other users of this Actor. Please feel free to use the tab to ask any questions, request new features, or give feedback. Alternatively, you can always write to [community@apify.com](mailto:community@apify.com).
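The cost caps described above can also be set programmatically when starting a run through the API. The following is a minimal sketch, assuming the run endpoint's `maxItems` (pay per result) and `maxTotalChargeUsd` (pay per event) query parameters; the Actor ID is a placeholder:

```javascript
// Sketch only - assumes the run endpoint accepts maxItems and
// maxTotalChargeUsd as query parameters; the Actor ID is a placeholder.
const actorId = 'username~my-actor';
const params = new URLSearchParams({
  maxItems: '500',        // cap on charged results (pay per result)
  maxTotalChargeUsd: '5', // cap on total run cost in USD (pay per event)
});
const runUrl = `https://api.apify.com/v2/acts/${actorId}/runs?${params}`;

// The run itself would be started with a POST request carrying your API token:
// await fetch(runUrl, { method: 'POST', headers: { Authorization: `Bearer ${token}` } });
console.log(runUrl);
```

Check the run endpoint reference for the current parameter names before relying on this in production.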
-![Paid Actors' issues tab](./images/store/paid-actors-issues-tab.png) +![Paid actors' issues tab](./images/store/paid-actors-issues-tab.png) ## Apify Store discounts diff --git a/sources/platform/collaboration/general-resource-access.md b/sources/platform/collaboration/general-resource-access.md index 5e3d695b74..4814f201b1 100644 --- a/sources/platform/collaboration/general-resource-access.md +++ b/sources/platform/collaboration/general-resource-access.md @@ -24,11 +24,11 @@ This setting affects the following resources: - Key-value stores - Request queues -Access to resources that require explicit access — such as Actors, tasks or schedules are not affected by this setting. +Access to resources that require explicit access - such as Actors, tasks, or schedules - is not affected by this setting. ![Setup account-level general resources access setting](./images/general-resouce-access//account-setting.png) -## How Restricted Access works +## How restricted access works If your **General resource access** is set to **Anyone with ID can read**, you can just send this link to anybody, and they will be able to download the data even if they don’t have an Apify account. However, once you change the setting to **Restricted**, this API call will require a valid token with access in order to work. In other words, you’ll have to explicitly share the dataset and you can only do that with people who have an Apify account. @@ -60,9 +60,9 @@ Even if your access is set to **Restricted** there are a few built-in exceptions #### Builds of public Actors -Builds of public Actors are always accessible to anyone who can view the Actor — regardless of the Actor owner’s account **General resource access** setting. +Builds of public Actors are always accessible to anyone who can view the Actor - regardless of the Actor owner's account **General resource access** setting.
For example, if you open a public Actor in Console, you’ll also be able to view its build details, download logs, or inspect the source package — without needing extra permissions or a token. +This ensures that public Actors in Apify Store continue to work as expected. For example, if you open a public Actor in Console, you'll also be able to view its build details, download logs, or inspect the source package - without needing extra permissions or a token. This exception exists to maintain usability and avoid breaking workflows that rely on public Actors. It only applies to builds of Actors that are marked as **public**. For private Actors, build access still follows the general resource access setting of the owner’s account. @@ -73,7 +73,7 @@ When you share an Actor with a collaborator, you can choose to share read-only a - This access includes logs, input, and default storages (dataset, key-value store, request queue) - Access is one-way: you won’t see the collaborator’s runs unless they share them - Collaborators can’t see each other’s runs -- This works even if your account uses **restricted general resource access** — permissions are applied automatically. +- This works even if your account uses **restricted general resource access** - permissions are applied automatically. #### Automatically sharing runs with public Actor creators @@ -83,13 +83,13 @@ If you’re using a public Actor from Apify Store, you can choose to automatical - When enabled, your runs of public Actors are automatically visible to the Actor’s creator - Shared runs include logs, input, and output storages (dataset, key-value store, request queue) -This sharing works even if your account has **General resource access** set to **Restricted** — the platform applies specific permission checks to ensure the Actor creator can access only the relevant runs. 
+This sharing works even if your account has **General resource access** set to **Restricted** - the platform applies specific permission checks to ensure the Actor creator can access only the relevant runs. You can disable this behavior at any time by turning off the setting in your account. #### Automatically sharing runs via Actor Issues -When you report an issue on an Actor and include a **run URL**, that run is automatically shared with the Actor developer — **even if your account uses restricted general resource access**. +When you report an issue on an Actor and include a **run URL**, that run is automatically shared with the Actor developer - **even if your account uses restricted general resource access**. This automatic sharing ensures the developer can view all the context they need to troubleshoot the issue effectively. That includes: @@ -101,7 +101,7 @@ This automatic sharing ensures the developer can view all the context they need The access is granted through explicit, behind-the-scenes permissions (not anonymous or public access), and is limited to just that run and its related storages. No other resources in your account are affected. 
-This means you don’t need to manually adjust permissions or share multiple links when reporting an Actor issue — **just including the run URL in your issue is enough** +This means you don't need to manually adjust permissions or share multiple links when reporting an Actor issue - **just including the run URL in your issue is enough** ![Sharing a run link in create Actor issue dialog makes it accessible to the developer automatically](./images/general-resouce-access/creating-actor-issue.png) @@ -125,7 +125,7 @@ await datasetClient.update({ ### Sharing restricted resources with pre-signed URLs {#pre-signed-urls} -Even when a resource is restricted, you might still want to share it with someone outside your team — for example, to send a PDF report to a client, or include a screenshot in an automated email or Slack message. In these cases, _storage resources_ (like key-value stores, datasets, and request queues) support generating _pre-signed URLs_. These are secure, time-limited links that let others access individual files without needing an Apify account or authentication. +Even when a resource is restricted, you might still want to share it with someone outside your team - for example, to send a PDF report to a client, or include a screenshot in an automated email or Slack message. In these cases, _storage resources_ (like key-value stores, datasets, and request queues) support generating _pre-signed URLs_. These are secure, time-limited links that let others access individual files without needing an Apify account or authentication. #### How pre-signed URLs work @@ -237,7 +237,7 @@ If the `expiresInSecs` option is not specified, the generated link will be _perm #### Signing URLs manually -If you need finer control — for example, generating links without using Apify client — you can sign URLs manually using our reference implementation. 
+If you need finer control - for example, generating links without using the Apify client - you can sign URLs manually using our reference implementation. [Check the reference implementation in Apify clients](https://github.com/apify/apify-client-js/blob/5efd68a3bc78c0173a62775f79425fad78f0e6d1/src/resource_clients/dataset.ts#L179) @@ -301,7 +301,7 @@ const recordUrl = `https://api.apify.com/v2/key-value-stores/${storeId}/records/ const storeClient = client.keyValueStore(storeId); const recordUrl = await storeClient.getRecordPublicUrl(recordKey); -// Save pre-signed URL — accessible without authentication +// Save pre-signed URL - accessible without authentication await Actor.pushData({ recordUrl }); ``` @@ -324,7 +324,7 @@ You can easily test this by switching your own account’s setting to _Restricte :::tip Make sure links work as expected -Once you’ve enabled restricted access, run your Actor and confirm that all links generated in logs, datasets, key-value stores, and status messages remain accessible as expected. Make sure any shared URLs — especially those stored in results or notifications — work without requiring an API token. +Once you've enabled restricted access, run your Actor and confirm that all links generated in logs, datasets, key-value stores, and status messages remain accessible as expected. Make sure any shared URLs - especially those stored in results or notifications - work without requiring an API token.
::: diff --git a/sources/platform/collaboration/organization_account/how_to_use.md b/sources/platform/collaboration/organization_account/how_to_use.md index ee54ae198b..3f78448e4a 100644 --- a/sources/platform/collaboration/organization_account/how_to_use.md +++ b/sources/platform/collaboration/organization_account/how_to_use.md @@ -22,7 +22,7 @@ You can switch into **Organization account** view using the account button in th ![Switch to organization account](../images/organizations/switch-to-organization.png) -In the menu, the account you are currently using is displayed at the top, with all the accounts you can switch to displayed below. When you need to get back to your personal account, you can just switch right back to it—no need to log in and out. +In the menu, the account you are currently using is displayed at the top, with all the accounts you can switch to displayed below. When you need to get back to your personal account, you can just switch right back to it - no need to log in and out. The resources you can access and account details you can edit will depend on your [permissions](../list_of_permissions.md) in the organization. diff --git a/sources/platform/console/index.md b/sources/platform/console/index.md index 78293381f6..d0c1e043bd 100644 --- a/sources/platform/console/index.md +++ b/sources/platform/console/index.md @@ -56,7 +56,7 @@ In case you forgot your password, you can click on the **Forgot your password?** ![Apify Console forgotten password page](./images/console-forgotten-password-page.png) -## Adding different authentication methods +## Add different authentication methods After you create your account, you might still want to use the other authentication methods. To do that, go to the [Login & Privacy](https://console.apify.com/settings/security) section of your account settings. There, you will see all available authentication methods and their configuration. 
diff --git a/sources/platform/console/two-factor-authentication.md b/sources/platform/console/two-factor-authentication.md index 0061735f77..75f8f88e55 100644 --- a/sources/platform/console/two-factor-authentication.md +++ b/sources/platform/console/two-factor-authentication.md @@ -14,7 +14,7 @@ If you use your email and password to sign in to Apify Console, you can enable t Some organizations might require two-factor authentication (2FA) to access their resources. Members of such an organization must enable 2FA on their account in order to continue accessing shared resources and maintain compliance with their security policies. -## Setting up two-factor authentication +## Set up two-factor authentication To set up two-factor authentication, go to the [Login & Privacy](https://console.apify.com/settings/security) section of your account settings. There, look for the **Two-factor authentication** section. Currently, there is only one option, which is the **Authenticator app**. If you have two-factor authentication already enabled, there will be a label **enabled** next to it. @@ -36,7 +36,7 @@ A new pop-up window will appear where you can copy the two-factor `secret` key, After you scan the QR code or set up your app manually, the app will generate a code that you need to enter into the **Verify the code from the app** field. After you enter the code, click on the **Continue** button to get to the next step of the setup process. -### Recovery settings +### Set up recovery settings ![Apify Console setup two-factor authentication - recovery codes](./images/console-two-factor-recovery-setup.png) @@ -63,7 +63,7 @@ After you enable two-factor authentication, the next time you attempt to sign in ![Apify Console two-factor authentication form](./images/console-two-factor-authentication.png) -## Using recovery codes +## Use recovery codes In case you lose access to your authenticator app, you can use the recovery codes to sign in to your account.
To do that, click on the **recovery code or begin 2FA account recovery** link below the **Verify** button. This will redirect you to a view similar to the current one, but instead of the code from the authenticator app, you will need to enter one of the 16 recovery codes you received during the setup process. @@ -76,7 +76,7 @@ When you successfully use a recovery code, we remove the code from the original ![Apify Console two-factor authentication with recovery code form](./images/console-two-factor-use-recovery-code.png) -## Disabling two-factor authentication +## Disable two-factor authentication If you no longer want to use the two-factor authentication or lose access to your authenticator app, you can disable the two-factor authentication in the [Login & Privacy](https://console.apify.com/settings/security) section of your account settings. See the **Two-factor authentication** section and click on the **Disable** button. We will ask you to enter either your verification code from the authenticator app or, if you do not have access to it anymore, you can use one of your recovery codes. After entering the code, click on the **Remove app** button to verify the provided code. If it's valid, it will disable the two-factor authentication and remove the configuration from your account. @@ -90,7 +90,7 @@ If you lose access to your authenticator app and do not have any recovery codes For our support team to help you recover your account, you will need to provide them with the personal information you have configured during the two-factor authentication setup. If you provide the correct information, the support team will help you regain access to your account. -:::caution +:::caution Support verification The support team will not give you any clues about the information you provided; they will only verify if it is correct.
::: diff --git a/sources/platform/index.mdx b/sources/platform/index.mdx index 63eab3a5f8..8a64ceda9e 100644 --- a/sources/platform/index.mdx +++ b/sources/platform/index.mdx @@ -13,7 +13,7 @@ import homepageContent from "./homepage_content.json"; **Apify** is a cloud platform and marketplace for web data extraction and automation tools called **Actors**. -## Getting started +## Get started Learn how to run any Actor in Apify Store or create your own. A step-by-step guide through your first steps on the Apify platform. diff --git a/sources/platform/integrations/actors/index.md b/sources/platform/integrations/actors/index.md index 18435148fd..a639ccc7aa 100644 --- a/sources/platform/integrations/actors/index.md +++ b/sources/platform/integrations/actors/index.md @@ -37,7 +37,7 @@ This leads you to a setup screen, where you can provide: - **Input for the integrated Actor**: Typically, the input has two parts. The information that is independent of the run triggering it and information that is specific for that run. The "independent" information (e.g. connection string to database or table name) can be added to the input as is. The information specific to the run (e.g. dataset ID) is either obtained from the implicit `payload` field (this is the case for most Actors that are integration-ready), or they can be provided using variables. - **Available variables** are the same ones as in webhooks. The one that you probably are going to need the most is `{{resource}}`, which is the Run object in the same shape you get from the [API](/api/v2/actor-run-get) (for build event types, it will be the Build object). The variables can make use of dot notation, so you will most likely just need `{{resource.defaultDatasetId}}` or `{{resource.defaultKeyValueStoreId}}`. -## Testing your integration +## Test your integration When adding a new integration, you can test it using a past run or build as a trigger.
This will trigger a run of your target Actor or task as if your desired trigger event just occurred. The only difference between a test run and a regular run is that the trigger's event type will be set to 'TEST'. The test run will still consume compute units. diff --git a/sources/platform/integrations/actors/integration_ready_actors.md b/sources/platform/integrations/actors/integration_ready_actors.md index b4f0e66fa2..4db22d6adb 100644 --- a/sources/platform/integrations/actors/integration_ready_actors.md +++ b/sources/platform/integrations/actors/integration_ready_actors.md @@ -88,7 +88,7 @@ const datasetIdToProcess = datasetId || payload?.resource?.defaultDatasetId; In the above example, we're focusing on accessing a run's default dataset, but the approach would be similar for any other field. -## Making your Actor available to other users +## Make your Actor available to other users To allow other users to use your Actor as an integration, all you need to do is [publish it in Apify Store](/platform/actors/publishing), so users can then integrate it using the **Connect Actor or task** button on the **Integrations** tab of any Actor. While publishing the Actor is enough, there are two ways to make it more visible to users. diff --git a/sources/platform/integrations/ai/aws_bedrock.md b/sources/platform/integrations/ai/aws_bedrock.md index 04d3cf9ccb..6ff61c2936 100644 --- a/sources/platform/integrations/ai/aws_bedrock.md +++ b/sources/platform/integrations/ai/aws_bedrock.md @@ -42,7 +42,7 @@ The following image illustrates the key components of an AWS Bedrock AI agent: ![AWS-Bedrock-AI-Agent](../images/aws-bedrock-ai-agent.png) -### Building an Agent +### Build an Agent To begin, open the Amazon Bedrock console and select agents from the left navigation panel. On the next screen, click Create agent to start building your agent.
diff --git a/sources/platform/integrations/ai/chatgpt.md b/sources/platform/integrations/ai/chatgpt.md index 9ab1176c99..96b46addef 100644 --- a/sources/platform/integrations/ai/chatgpt.md +++ b/sources/platform/integrations/ai/chatgpt.md @@ -65,7 +65,7 @@ Once your connector is ready: > “Search the web and summarize recent trends in AI agents” You’ll need to grant permission for each Apify tool when it’s used for the first time. -You should see ChatGPT calling Apify tools — such as the [RAG Web Browser](https://apify.com/apify/rag-web-browser) — to gather information. +You should see ChatGPT calling Apify tools - such as the [RAG Web Browser](https://apify.com/apify/rag-web-browser) - to gather information. ![ChatGPT Apify tools](../images/chatgpt-with-rag-web-browser.png) diff --git a/sources/platform/integrations/ai/crewai.md b/sources/platform/integrations/ai/crewai.md index 1e7f6336a4..c248bd3983 100644 --- a/sources/platform/integrations/ai/crewai.md +++ b/sources/platform/integrations/ai/crewai.md @@ -12,7 +12,7 @@ slug: /integrations/crewai ## What is CrewAI -[CrewAI](https://www.crewai.com/) is an open-source Python framework designed to orchestrate autonomous, role-playing AI agents that collaborate as a "crew" to tackle complex tasks. It enables developers to define agents with specific roles, assign tasks, and integrate tools—like Apify Actors—for real-world data retrieval and automation. +[CrewAI](https://www.crewai.com/) is an open-source Python framework designed to orchestrate autonomous, role-playing AI agents that collaborate as a "crew" to tackle complex tasks. It enables developers to define agents with specific roles, assign tasks, and integrate tools - like Apify Actors - for real-world data retrieval and automation. 
:::note Explore CrewAI @@ -34,7 +34,7 @@ This guide demonstrates how to integrate Apify Actors with CrewAI by building a pip install 'crewai[tools]' langchain-apify langchain-openai ``` -### Building the TikTok profile search and analysis crew +### Build the TikTok profile search and analysis crew First, import all required packages: diff --git a/sources/platform/integrations/ai/flowise.md b/sources/platform/integrations/ai/flowise.md index fbffcfa60f..97f04edbb4 100644 --- a/sources/platform/integrations/ai/flowise.md +++ b/sources/platform/integrations/ai/flowise.md @@ -36,7 +36,7 @@ It will be available on `https://localhost:3000` Other methods of using Flowise can be found in their [documentation](https://docs.flowiseai.com/getting-started#quick-start) -### Building your flow +### Build your flow After running Flowise, you can start building your flow with Apify. diff --git a/sources/platform/integrations/ai/langflow.md b/sources/platform/integrations/ai/langflow.md index d0f0852269..c195dd2dcf 100644 --- a/sources/platform/integrations/ai/langflow.md +++ b/sources/platform/integrations/ai/langflow.md @@ -57,14 +57,14 @@ When the platform is started, open the Langflow UI using `http://127.0.0.1:7860` > Other installation methods can be found in the [Langflow documentation](https://docs.langflow.org/get-started-installation). -### Creating a new flow +### Create a new flow On the Langflow welcome screen, click the **New Flow** button and then create **Blank Flow**: ![New Flow screen - Blank Flow](../images/langflow/new_blank_flow.png) Now, you can start building your flow. -### Calling Apify Actors in Langflow +### Call Apify Actors in Langflow To call Apify Actors in Langflow, you need to add the **Apify Actors** component to the flow. 
From the bundle menu, add **Apify Actors** component: @@ -98,7 +98,7 @@ When you run the component again, the output contains only the `markdown` and fl Now that you understand how to call Apify Actors, let's build a practical example where you search for a company's social media profiles and extract data from them. -### Building a flow to search for a company's social media profiles +### Build a flow to search for a company's social media profiles Create a new flow and add two **Apify Actors** components from the menu. diff --git a/sources/platform/integrations/ai/langgraph.md b/sources/platform/integrations/ai/langgraph.md index 27d76d9e41..cd7be22418 100644 --- a/sources/platform/integrations/ai/langgraph.md +++ b/sources/platform/integrations/ai/langgraph.md @@ -36,7 +36,7 @@ This guide will demonstrate how to use Apify Actors with LangGraph by building a pip install langgraph langchain-apify langchain-openai ``` -### Building the TikTok profile search and analysis agent +### Build the TikTok profile search and analysis agent First, import all required packages: diff --git a/sources/platform/integrations/ai/lindy.md b/sources/platform/integrations/ai/lindy.md index 6275507d56..76846c5c95 100644 --- a/sources/platform/integrations/ai/lindy.md +++ b/sources/platform/integrations/ai/lindy.md @@ -58,7 +58,7 @@ You have access to thousands of Actors available on the [Apify Store](https://ap This establishes the fundamental workflow:
_Chatting with Lindy can now trigger the Apify Instagram Profile Scraper._ -### Extending Your Workflow +### Extending your workflow Lindy offers different triggers (e.g., _email received_, _Slack message received_, etc.) and actions beyond running an Actor. diff --git a/sources/platform/integrations/ai/mastra.md b/sources/platform/integrations/ai/mastra.md index 7c666d73d2..d2ddf2ba25 100644 --- a/sources/platform/integrations/ai/mastra.md +++ b/sources/platform/integrations/ai/mastra.md @@ -39,7 +39,7 @@ This guide demonstrates how to integrate Apify Actors with Mastra by building an npm install @mastra/core @mastra/mcp @ai-sdk/openai ``` -### Building the TikTok profile search and analysis agent +### Build the TikTok profile search and analysis agent First, import all required packages: @@ -147,7 +147,7 @@ You will see the agent’s output in the console, showing the results of the sea Connecting to Mastra MCP server... Fetching tools... Generating response for prompt: Search the web for the OpenAI TikTok profile URL, then extract and summarize its data. -### OpenAI TikTok Profile Summary +### OpenAI TikTok profile summary - **Profile URL**: [OpenAI on TikTok](https://www.tiktok.com/@openai?lang=en) - **Followers**: 608,100 - **Likes**: 3.4 million - **Videos Posted**: 156 diff --git a/sources/platform/integrations/ai/openai_agents.md b/sources/platform/integrations/ai/openai_agents.md index 1ec6ce8f26..277280da57 100644 --- a/sources/platform/integrations/ai/openai_agents.md +++ b/sources/platform/integrations/ai/openai_agents.md @@ -24,7 +24,7 @@ Before integrating Apify with OpenAI Agents SDK, you'll need: pip install agents openai ``` -## Building a web search agent with Apify MCP +## Build a web search agent with Apify MCP You can connect to the Apify MCP server using streamable HTTP with Bearer token authentication. Use your Apify API token by setting the `Authorization: Bearer ` header in the MCP server configuration. 
@@ -84,7 +84,7 @@ The agent may take some time (seconds or even minutes) to execute tool calls, es ::: -### Using specific Actors +### Use specific Actors You can configure the Apify MCP server to expose specific Actors by including them in the URL query parameters. For example, to use an Instagram scraper: diff --git a/sources/platform/integrations/ai/skyfire.md b/sources/platform/integrations/ai/skyfire.md index 20e1f4a38c..5b464c68b1 100644 --- a/sources/platform/integrations/ai/skyfire.md +++ b/sources/platform/integrations/ai/skyfire.md @@ -25,7 +25,7 @@ Keep in mind that agentic payments are an experimental feature and may undergo s With Skyfire integration, agents can discover available Apify Actors, execute scraping and automation tasks, and pay for services using pre-funded Skyfire tokens, all without human intervention. -## Using Skyfire with Apify MCP Server +## Use Skyfire with Apify MCP Server The [Apify MCP server](https://docs.apify.com/platform/integrations/mcp) provides the simplest way for agents to access Apify's Actor library using Skyfire payments. @@ -136,7 +136,7 @@ See which Actors [support agentic payments](#supported-actors). When not pre-loading Actors, agents can discover suitable Actors dynamically using the search tools. The search automatically filters results to show only Actors that support agentic payments. -## Using Skyfire with Apify API +## Use Skyfire with Apify API For direct API integration, you can use Skyfire PAY tokens to authenticate and pay for Actor runs. @@ -154,7 +154,7 @@ Instead of using a traditional Apify API token, pass your Skyfire PAY token in t skyfire-pay-id: YOUR_SKYFIRE_PAY_TOKEN ``` -### Running an Actor +### Run an Actor Make a standard Actor run request to the [run Actor endpoint](https://docs.apify.com/api/v2#/reference/actors/run-collection/run-actor), but include the Skyfire PAY token in the header. 
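As an illustrative sketch of the request described above (the Actor ID and input are hypothetical placeholders, and no real token is shown), the run request might be assembled like this:

```javascript
// Sketch only - starting an Actor run paid with a Skyfire PAY token.
// The skyfire-pay-id header replaces the usual Apify Authorization header;
// the Actor ID, input, and token value below are placeholders.
const actorId = 'username~my-actor';
const endpoint = `https://api.apify.com/v2/acts/${actorId}/runs`;

const runRequest = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'skyfire-pay-id': 'YOUR_SKYFIRE_PAY_TOKEN', // Skyfire PAY token goes here
  },
  body: JSON.stringify({ startUrls: [{ url: 'https://example.com' }] }),
};

// await fetch(endpoint, runRequest); // would start the run (not executed here)
console.log(endpoint);
```

Apart from the authentication header, the request body and endpoint are the same as for a regular Actor run.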
diff --git a/sources/platform/integrations/ai/vercel-ai-sdk.md b/sources/platform/integrations/ai/vercel-ai-sdk.md index 37a773b518..5fb2d22727 100644 --- a/sources/platform/integrations/ai/vercel-ai-sdk.md +++ b/sources/platform/integrations/ai/vercel-ai-sdk.md @@ -34,7 +34,7 @@ Apify is a marketplace of ready-to-use web scraping and automation tools, AI age npm install @modelcontextprotocol/sdk @openrouter/ai-sdk-provider ai ``` -### Building a simple pub search AI agent using Apify Google Maps scraper +### Build a simple pub search AI agent using Apify Google Maps scraper First, import all required packages: diff --git a/sources/platform/integrations/data-storage/airtable/index.md b/sources/platform/integrations/data-storage/airtable/index.md index d2551837ad..685c9235c7 100644 --- a/sources/platform/integrations/data-storage/airtable/index.md +++ b/sources/platform/integrations/data-storage/airtable/index.md @@ -90,7 +90,7 @@ Retrieve items from any Apify dataset and import them into your Airtable base wi This section explains how to map your Actor run results or dataset items into your Airtable base. -#### Understanding mapping rows +#### Understand mapping rows The Apify extension provides UI elements that allow you to map dataset fields to Airtable fields. @@ -132,7 +132,7 @@ _How it works_: For a source field like `crawl.depth`, the extension checks for To prevent duplicate records, select a **Unique ID** on the data mapping step. The unique ID is added to the list of mapping rows. Ensure it points to the correct field in your table. During import, the extension filters data by existing values in the table. 
![Select unique ID](../../images/airtable/airtable_unique_id.png) -#### Preview Mapped Data +#### Preview mapped data Preview the results and start the import diff --git a/sources/platform/integrations/data-storage/keboola.md b/sources/platform/integrations/data-storage/keboola.md index e75ba93576..fbedc427a9 100644 --- a/sources/platform/integrations/data-storage/keboola.md +++ b/sources/platform/integrations/data-storage/keboola.md @@ -21,7 +21,7 @@ To use the Apify integration on Keboola, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Keboola account](https://www.keboola.com/). -### Step 1: Create a new Data Source in Keboola +### Step 1: create a new data source in Keboola Once your Keboola account is ready and you are logged in, navigate to the **Components** section in the top menu and click the **Add Component** button. @@ -35,7 +35,7 @@ Provide a name and description for your configuration, then click the **Create C ![Keboola configuration setup](../images/keboola/keboola-create-configuration.png) -### Step 2: Configure the Apify Data Source +### Step 2: configure the Apify data source With the new configuration created, you can now configure the data source to retrieve the needed data. Click on the **Configure Component** button to begin the setup process. @@ -75,7 +75,7 @@ Once you have filled in all the necessary options, click the **Save** button to ![Keboola component specification setup](../images/keboola/keboola-setup-specification.png) -### Step 3: Run the configured Data Source +### Step 3: run the configured data source After your data source has been configured, you can run it by clicking the **Run** button in the upper-right corner of your configuration. 
diff --git a/sources/platform/integrations/integrate_with_apify.md b/sources/platform/integrations/integrate_with_apify.md index a99f6176e3..c43b22b4c6 100644 --- a/sources/platform/integrations/integrate_with_apify.md +++ b/sources/platform/integrations/integrate_with_apify.md @@ -34,7 +34,7 @@ Actor-specific integrations are designed for targeted use cases. While they work For more examples both general and Actor-specific, check [integrations](./index.mdx). -## Integrating with Apify +## Integrate with Apify To integrate your service with Apify, you have two options: @@ -43,11 +43,11 @@ To integrate your service with Apify, you have two options: ![Integration-ready Actors](./images/integration-ready-actors.png) -### Building an integration Actor +### Build an integration Actor One way to reach out to Apify users is directly within [Apify Console](https://console.apify.com). To do that, you need to build an integrable Actor that can be piped into other Actors to upload existing data into a database. This can then be easily configured within Apify Console. Follow the [guide on building integration-ready Actors](./actors/integration_ready_actors.md). -### Building an external integration +### Build an external integration An alternative way is to let your users manage the connection directly on your side using [Apify API](https://docs.apify.com/api/v2) and our API clients for [JavaScript](/api/client/js/) or [Python](/api/client/python/). This way, users can manage the connection directly from your service. @@ -155,7 +155,7 @@ Users create their own Apify accounts and are billed directly by Apify for their Users access Apify through your platform without needing an Apify account. Apify bills you based on consumption, and you factor costs into your pricing. -### Monitoring and tracking +### Monitor and tracking To help Apify monitor and support your integration, every API request should identify your platform. 
You can do this in one of two ways: diff --git a/sources/platform/integrations/programming/api.md b/sources/platform/integrations/programming/api.md index 6923b2d64f..20bb1762f6 100644 --- a/sources/platform/integrations/programming/api.md +++ b/sources/platform/integrations/programming/api.md @@ -16,7 +16,7 @@ If you want to use the Apify API from JavaScript/Node.js or Python, we strongly - [**apify-client**](/api/client/js/) `npm` package for JavaScript, supporting both browser and server - [**apify-client**](/api/client/python/) PyPI package for Python. -You are not required to those packages—the REST API works with any HTTP client—but the official API clients implement best practices such as exponential backoff and rate limiting. +You are not required to use those packages - the REST API works with any HTTP client - but the official API clients implement best practices such as exponential backoff and rate limiting. ## API token @@ -24,7 +24,7 @@ To access the Apify API in your integrations, you need to authenticate using you ![Integrations page in Apify Console](../images/api-token.png) -:::caution +:::caution Protect your API token Do not share the API token with untrusted parties, or use it directly from client-side code, unless you fully understand the consequences! You can also consider [limiting the permission scope](#limited-permissions) of the token, so that it can only access what it really needs. ::: @@ -69,7 +69,7 @@ By default, tokens can access all data in your account. If that is not desirable **A scoped token can access only those resources that you'll explicitly allow it to.** -:::info +:::info Actor modification restrictions We do not allow scoped tokens to create or modify Actors. If you do need to create or modify Actors through Apify API, use an unscoped token. ::: @@ -89,19 +89,19 @@ We support two different types of permissions for tokens: - **Resource-specific permissions**: These will apply only to specific, existing resources.
For example, you can use these to allow the token to read from a particular dataset. -:::tip +:::tip Combine permission types A single token can combine both types. You can create a token that can _read_ any data storage, but _write_ only to one specific key-value store. ::: ![An example scoped token that combines account-level permissions and resource-specific permissions](../images/api-token-scoped-with-combining-permissions.png) -### Allowing tokens to create resources +### Allow tokens to create resources If you need to create new resources with the token (for example, create a new task, or storage), you need to explicitly allow that as well. Once you create a new resource with the token, _the token will gain full access to that resource_, regardless of other permissions. It is not possible to create a token that can create a dataset, but not write to it. -:::tip +:::tip Dynamic resource creation This is useful if you want to for example create a token that can dynamically create & populate datasets, but without the need to access other datasets in your account. ::: @@ -124,7 +124,7 @@ Specifically: - To create or update a Schedule, the token needs access not only to the Schedule itself, but also to the Actor (the **Run** permission) or task (the **Read** permission) that is being scheduled. - Similarly, to create, update or run a task, the token needs the **Run** permission on the task's Actor itself. -:::tip +:::tip Schedule creation example Let's say that you have an Actor and you want to programmatically create schedules for that Actor. Then you can create a token that has the account level **Create** permission on schedules, but only the resource-specific **Run** permission on the Actor. Such a token has exactly the permissions it needs, and nothing more. 
::: @@ -147,7 +147,7 @@ When you run an Actor with a scoped token in this mode, Apify will inject an _un This way you can be sure that once you give a token the permission to run an Actor, it will just work, and you don't have to worry about the exact permissions the Actor might need. However, this also means that you need to trust the Actor. -:::tip +:::tip Third-party integration Use this mode if you want to integrate with a 3rd-party service to run your Actors. Create a scoped token that can only run the Actor you need, and share it with the service. Even if the token is leaked, it can't be used to access your other data. ::: @@ -155,12 +155,13 @@ Use this mode if you want to integrate with a 3rd-party service to run your Acto When you run an Actor with a scoped token in this mode, Apify will inject a token with the same scope as the scope of the original token. -This way you can be sure that Actors won't accidentally—or intentionally—access any data they shouldn't. However, Actors might not function properly if the scope is not sufficient. +This way you can be sure that Actors won't accidentally - or intentionally - access any data they shouldn't. However, Actors might not function properly if the scope is not sufficient. -:::caution +:::caution Standby mode limitation Restricted access mode is not supported for Actors running in [Standby mode](/platform/actors/running/standby). While you can send standby requests using a scoped token configured with restricted access, functionality is not guaranteed. +::: -:::tip +:::tip Transitive restrictions This restriction is _transitive_, which means that if the Actor runs another Actor, its access will be restricted as well. ::: @@ -180,7 +181,7 @@ If the toggle is **off**, the token can still trigger and inspect runs, but acce - For accounts with **Unrestricted general resource access**, the default storages can still be read anonymously using their IDs, but writing is prevented. 
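The scoped-token model described in this section (account-level permissions combined with resource-specific grants) can be pictured with a toy sketch. This is purely illustrative, not Apify's implementation, and the permission names are invented:

```python
from dataclasses import dataclass, field

# Toy model only (not Apify's implementation; permission names invented):
# a scoped token grants an action if it holds the permission at the
# account level, or specifically for the resource in question.
@dataclass
class ScopedToken:
    account_permissions: set = field(default_factory=set)
    resource_permissions: dict = field(default_factory=dict)

    def allows(self, action: str, resource_id: str) -> bool:
        return (
            action in self.account_permissions
            or action in self.resource_permissions.get(resource_id, set())
        )

# Read any key-value store, but write only to "store-123".
token = ScopedToken(
    account_permissions={"kvs:read"},
    resource_permissions={"store-123": {"kvs:write"}},
)
```

In this toy model, `token` can read every store but write only to `store-123`, mirroring the "read anywhere, write to one store" example from the tip above.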
-:::tip +:::tip Clean up run data Let's say your Actor produces a lot of data that you want to delete just after the Actor finishes. If you enable this toggle, your scoped token will be allowed to do that. ::: @@ -198,11 +199,11 @@ If you set up a webhook pointing to the Apify API, the Apify platform will autom Therefore, you need to make sure the token has sufficient permissions not only to set up the webhook, but also to perform the actual operation. -:::tip +:::tip Webhook permissions Let's say you want to create a webhook that pushes an item to a dataset every time an Actor successfully finishes. Then such a scoped token needs to be allowed to both run the Actor (to create the webhook), and write to that dataset. ::: -### Troubleshooting +### Troubleshoot scoped tokens #### How do I allow a token to run a task? diff --git a/sources/platform/integrations/workflows-and-notifications/bubble.md b/sources/platform/integrations/workflows-and-notifications/bubble.md index 33292e9049..3facf21af5 100644 --- a/sources/platform/integrations/workflows-and-notifications/bubble.md +++ b/sources/platform/integrations/workflows-and-notifications/bubble.md @@ -63,7 +63,7 @@ When configuring Apify actions in a workflow (check out screenshot below), set t - ![Current User's API token](../images/bubble/data_select_user_api_key.png) -## Using the integration +## Use the integration Once the plugin is configured, you can start building automated workflows. @@ -166,8 +166,8 @@ There are two common approaches: ### Display data - This example appends the text result of an Actor run; it's a basic bind to the element’s text. -- Create / select the UI visual element — in this example, `Text`. 
-- In the Appearance tab, click the input area, select Insert dynamic data, and, according to your case, find the source — in this example, it's the `key_value_storages's recordContentText` custom state, where I set the result of the API call +- Create / select the UI visual element - in this example, `Text`. +- In the Appearance tab, click the input area, select Insert dynamic data, and, according to your case, find the source - in this example, it's the `key_value_storages's recordContentText` custom state, where I set the result of the API call - ![Display text data](../images/bubble/text_dynamic_content.png) ### Display list of data @@ -175,13 +175,13 @@ There are two common approaches: - This example lists the current user's datasets and displays them in a repeating group. - Add a **Repeating group** to the page. 1. Add data to a variable: create a custom state (for example, on the page) that will hold the list of datasets, and set it to the plugin's **List User Datasets** data call. - - ![Step 1 — Set variable with user's datasets](../images/bubble/user_dataset_repeating_group_set.png) + - ![Step 1 - Set variable with user's datasets](../images/bubble/user_dataset_repeating_group_set.png) 1. Set the type: in the repeating group's settings, set **Type of content** to match the dataset object your variable returns. - - ![Step 2 — Repeating group type of content](../images/bubble/user_dataset_repeating_group.png) + - ![Step 2 - Repeating group type of content](../images/bubble/user_dataset_repeating_group.png) 1. Bind the variable: set the repeating group's **Data source** to the variable from Step 1. - - ![Step 3 — Repeating group data source](../images/bubble/user_dataset_repeating_group_source.png) + - ![Step 3 - Repeating group data source](../images/bubble/user_dataset_repeating_group_source.png) - Inside the repeating group cell, bind dataset fields (for example, `Current cell's item name`, `id`, `createdAt`). 
-- ![Step 4 — Repeating group data cell](../images/bubble/user_dataset_repeating_group_cell.png) +- ![Step 4 - Repeating group data cell](../images/bubble/user_dataset_repeating_group_cell.png) ## Long‑running scrapes and Bubble time limits (async pattern) diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/index.md b/sources/platform/integrations/workflows-and-notifications/gumloop/index.md index 923029920c..c7b21f60cd 100644 --- a/sources/platform/integrations/workflows-and-notifications/gumloop/index.md +++ b/sources/platform/integrations/workflows-and-notifications/gumloop/index.md @@ -38,7 +38,7 @@ Each tool has a corresponding Gumloop credit cost. Each Gumloop subscription com | Get videos for a specific hashtag | Get Hashtag Videos | 3 credits/video | | Show 5 most recent reviews for a restaurant | Get Place Reviews | 3 credits/review | -## General integration (Apify Task Runner) +## General integration (Apify task runner) Gumloop's Apify task runner lets you run your Apify tasks directly inside Gumloop workflows. Scrape data with Apify, then process it with AI, send results via email, update spreadsheets, or connect to any of Gumloop's 100+ integrations. diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md index 61e10bd92f..92bab2f59d 100644 --- a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md +++ b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md @@ -22,10 +22,10 @@ You can pull the following types of data from TikTok using Gumloop’s TikTok no | Get hashtag videos | Fetch videos from TikTok hashtags with captions, engagement metrics, play counts, and author information. | 3 credits per item | | Get profile videos | Get videos from TikTok user profiles with video metadata, engagement stats, music info, and timestamps. 
| 3 credits per item | | Get profile followers | Retrieve followers or following lists from TikTok profiles, including usernames, follower counts, and bios. | 3 credits per item | -| Get video details | Get comprehensive data on a specific TikTok video using its URL—includes engagement and video-level metrics. | 5 credits per item | +| Get video details | Get comprehensive data on a specific TikTok video using its URL - includes engagement and video-level metrics. | 5 credits per item | | Search videos | Search TikTok for videos and users using queries. Returns video details and user profile info. | 3 credits per item | -## Retrieve Tiktok Data in Gumloop +## Retrieve TikTok data in Gumloop 1. _Add the Gumloop TikTok MCP node_ diff --git a/sources/platform/integrations/workflows-and-notifications/kestra.md b/sources/platform/integrations/workflows-and-notifications/kestra.md index dec808cd5d..e4efca90e5 100644 --- a/sources/platform/integrations/workflows-and-notifications/kestra.md +++ b/sources/platform/integrations/workflows-and-notifications/kestra.md @@ -1,16 +1,16 @@ --- title: Kestra integration -description: Connect Apify with Kestra to orchestrate workflows — run flows, extract structured data, and react to Actor or task events. +description: Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events. sidebar_label: Kestra sidebar_position: 7 slug: /integrations/kestra --- -**Connect Apify with Kestra to orchestrate workflows — run flows, extract structured data, and react to Actor or task events.** +**Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events.** --- -[Kestra](https://kestra.io/) is an open-source, event-driven orchestration platform. The [Apify plugin for Kestra](https://github.com/kestra-io/plugin-kestra) connects Apify Actors and storage to your workflows.
Run scrapers, extract structured data — all defined declaratively in YAML and orchestrated directly from the UI. +[Kestra](https://kestra.io/) is an open-source, event-driven orchestration platform. The [Apify plugin for Kestra](https://github.com/kestra-io/plugin-kestra) connects Apify Actors and storage to your workflows. Run scrapers, extract structured data - all defined declaratively in YAML and orchestrated directly from the UI. This guide shows you how to set up the integration, configure authentication, and create a workflow that runs an Actor and processes its results. diff --git a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md index d1a08ebcf6..f487a70a6b 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md +++ b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md @@ -35,11 +35,11 @@ To use these modules, you need an [Apify account](https://console.apify.com) and Once connected, you can build workflows to automate website extraction and integrate results into your AI applications. -## Apify Scraper for Website Content modules +## Apify Scraper for website content modules After connecting the app, you can use one of the two modules as native scrapers to extract website content. -### Standard Settings Module +### Standard settings module The Standard Settings module is a streamlined component of the Website Content Crawler that allows you to quickly extract content from websites using optimized default settings. This module is perfect for extracting content from blogs, documentation sites, knowledge bases, or any text-rich website to feed into AI models. 
@@ -95,7 +95,7 @@ For each crawled web page, you'll receive: } ``` -### Advanced Settings Module +### Advanced settings module The Advanced Settings module provides complete control over the content extraction process, allowing you to fine-tune every aspect of the crawling and transformation pipeline. This module is ideal for complex websites, JavaScript-heavy applications, or when you need precise control over content extraction. diff --git a/sources/platform/integrations/workflows-and-notifications/make/amazon.md b/sources/platform/integrations/workflows-and-notifications/make/amazon.md index 9743f62744..cba0b527c1 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/amazon.md +++ b/sources/platform/integrations/workflows-and-notifications/make/amazon.md @@ -113,7 +113,7 @@ For Amazon URLs, you can extract: "reviewsCount": 107637, "thumbnailImage": "https://m.media-amazon.com/images/I/61gSpxZTZZL.__AC_SX300_SY300_QL70_ML2_.jpg", "breadCrumbs": "Electronics›Computers & Accessories›Computer Accessories & Peripherals›Keyboards, Mice & Accessories›Keyboard & Mouse Combos", - "description": "The stylish Logitech MK270 Wireless Keyboard and Mouse Combo is perfect for the home office or workplace. Ditch the touchpad for this full size keyboard and mouse. Easily connect using Logitech's plug and forget receiver—just plug it into the USB port, and you're ready to work. There's no lengthy installation procedure to slow you down. When you're on the move, the receiver stores comfortably inside the mouse. Both the keyboard and mouse included in the MK270 combo use wireless 2.4GHz connectivity to provide seamless, interruption free use. Use the keyboard within a 10 m range without keyboard lag. Work for longer with the MK270's long battery life. The keyboard can be used for up to 24 months, and the mouse for 12 months, without replacing batteries. 
The Logitech MK270 keyboard includes 8 hotkeys that are programmable to your most used applications to boost your productivity.", + "description": "The stylish Logitech MK270 Wireless Keyboard and Mouse Combo is perfect for the home office or workplace. Ditch the touchpad for this full size keyboard and mouse. Easily connect using Logitech's plug and forget receiver - just plug it into the USB port, and you're ready to work. There's no lengthy installation procedure to slow you down. When you're on the move, the receiver stores comfortably inside the mouse. Both the keyboard and mouse included in the MK270 combo use wireless 2.4GHz connectivity to provide seamless, interruption free use. Use the keyboard within a 10 m range without keyboard lag. Work for longer with the MK270's long battery life. The keyboard can be used for up to 24 months, and the mouse for 12 months, without replacing batteries. The Logitech MK270 keyboard includes 8 hotkeys that are programmable to your most used applications to boost your productivity.", "price": { "value": 21.98, "currency": "$" diff --git a/sources/platform/integrations/workflows-and-notifications/make/index.md b/sources/platform/integrations/workflows-and-notifications/make/index.md index 5943aca327..1a8bfd8e00 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/index.md +++ b/sources/platform/integrations/workflows-and-notifications/make/index.md @@ -65,7 +65,7 @@ The primary difference between the two methods is that the synchronous run waits In this example, we will demonstrate how to run an Actor synchronously and export the output to Google Sheets. The same principle applies to module that runs a task. -#### Step 1: Add the Apify "Run an Actor" Module +#### Step 1: add the Apify "Run an Actor" module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Run an Actor" to your scenario and configure it.
@@ -75,7 +75,7 @@ Make sure to set the "Run synchronously" option to "Yes," so the module waits fo ![make-com-sync-2.png](../../images/make-com/make-com-sync-2.png) -#### Step 2: Add the Apify "Get Dataset Items" module +#### Step 2: add the Apify "Get Dataset Items" module In the next step, add the "Get Dataset Items" module to your scenario, which is responsible for retrieving the output data from the Actor run. @@ -84,7 +84,7 @@ You can find this dataset ID in the variables generated by the previous "Run an ![make-com-sync-3.png](../../images/make-com/make-com-sync-3.png) -#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" module +#### Step 3: add the Google Sheets "Create Spreadsheet Rows" module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario. This module will automatically create new rows in a Google Sheets file to store the Actor's output. @@ -99,7 +99,7 @@ You’re all set! Once the scenario is started, it will run the Actor synchronou In this example, we will demonstrate how to run an Actor asynchronously and export its output to Google Sheets. Before starting, decide where you want to initiate the Actor run. You can do this manually via the Apify console, on a schedule, or from a separate Make.com scenario. -#### Step 1: Add the Apify "Watch Actor Runs" Module +#### Step 1: add the Apify "Watch Actor Runs" module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Watch Actor Runs" to your scenario. This module will set up a webhook to listen for the finished runs of the selected Actor. @@ -108,7 +108,7 @@ For this example, we will use the "Google Maps Review Scraper" Actor. ![make-com-async-1.png](../../images/make-com/make-com-async-1.png) -#### Step 2: Add the Apify "Get Dataset Items" module +#### Step 2: add the Apify "Get Dataset Items" module Add the "Get Dataset Items" module to your scenario to retrieve the output of the Actor run.
@@ -116,7 +116,7 @@ In the "Dataset ID" field, provide the default dataset ID from the Actor run. Yo ![make-com-async-2.png](../../images/make-com/make-com-async-2.png) -#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" module +#### Step 3: add the Google Sheets "Create Spreadsheet Rows" module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario, which will create new rows in the specified Google Sheets file to store the Actor's output. diff --git a/sources/platform/integrations/workflows-and-notifications/make/instagram.md b/sources/platform/integrations/workflows-and-notifications/make/instagram.md index 1c71f0d07f..9dd2170d38 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/instagram.md +++ b/sources/platform/integrations/workflows-and-notifications/make/instagram.md @@ -156,7 +156,7 @@ For each Instagram post, you will extract: "timestamp": "2024-11-08T17:30:07.000Z" }, { - "caption": "Take a deep breath...\n\nX-ray images from our Chandra X-ray Observatory helped astronomers confirm that most of the oxygen in the universe is synthesized in massive stars. So, everybody say \"thank you\" to supernova remnants (SNRs) like this one, which has enough oxygen for thousands of solar systems.\n\nSupernova remnants are, naturally, the remains of exploded stars. They're extremely important for understanding our galaxy. If it weren't for SNRs, there would be no Earth, no plants, animals, or people.
This is because all the elements heavier than iron were made in a supernova explosion, so the only reason we find these elements on Earth or in our solar system — or any other extrasolar planetary system — is because those elements were formed during a supernova.\n\n@nasachandraxray's data is represented in this image by blue and purple, while optical data from @nasahubble and the Very Large Telescope in Chile are in red and green.\n\nImage description: The darkness of space is almost covered by the array of objects in this image. Stars of different sizes are strewn about, while a blue and red bubble of gas is at the center. An area of pink and green covers the bottom-right corner.\n\nCredit: X-ray (NASA/CXC/ESO/F.Vogt et al); Optical (ESO/VLT/MUSE), Optical (NASA/STScI)\n\n#NASA #Supernova #Space #Universe #Astronomy #Astrophotography #Telescope #Xray", + "caption": "Take a deep breath...\n\nX-ray images from our Chandra X-ray Observatory helped astronomers confirm that most of the oxygen in the universe is synthesized in massive stars. So, everybody say \"thank you\" to supernova remnants (SNRs) like this one, which has enough oxygen for thousands of solar systems.\n\nSupernova remnants are, naturally, the remains of exploded stars. They're extremely important for understanding our galaxy. If it weren't for SNRs, there would be no Earth, no plants, animals, or people. This is because all the elements heavier than iron were made in a supernova explosion, so the only reason we find these elements on Earth or in our solar system - or any other extrasolar planetary system - is because those elements were formed during a supernova.\n\n@nasachandraxray's data is represented in this image by blue and purple, while optical data from @nasahubble and the Very Large Telescope in Chile are in red and green.\n\nImage description: The darkness of space is almost covered by the array of objects in this image. 
Stars of different sizes are strewn about, while a blue and red bubble of gas is at the center. An area of pink and green covers the bottom-right corner.\n\nCredit: X-ray (NASA/CXC/ESO/F.Vogt et al); Optical (ESO/VLT/MUSE), Optical (NASA/STScI)\n\n#NASA #Supernova #Space #Universe #Astronomy #Astrophotography #Telescope #Xray", "ownerFullName": "NASA", "ownerUsername": "nasa", "url": "https://www.instagram.com/p/DBKBByizDHZ/", @@ -166,7 +166,7 @@ For each Instagram post, you will extract: "timestamp": "2024-10-15T19:27:29.000Z" }, { - "caption": "It’s giving rainbows and unicorns, like a middle school binder 🦄🌈 ⁣⁣\n⁣⁣\nMeet NGC 602, a young star cluster in the Small Magellanic Cloud (one of our satellite galaxies), where astronomers using @NASAWebb have found candidates for the first brown dwarfs outside of our galaxy. This star cluster has a similar environment to the kinds of star-forming regions that would have existed in the early universe—with very low amounts of elements heavier than hydrogen and helium. It’s drastically different from our own solar neighborhood and close enough to study in detail. ⁣⁣\n ⁣⁣\nBrown dwarfs are… not quite stars, but also not quite gas giant planets either. Typically they range from about 13 to 75 Jupiter masses. They are sometimes free-floating and not gravitationally bound to a star, like a planet would be. But they do share some characteristics with exoplanets, like storm patterns and atmospheric composition. ⁣⁣\n\n@NASAHubble showed us that NGC 602 harbors some very young low-mass stars; Webb is showing us how significant and extensive objects like brown dwarfs are in this cluster. Scientists are excited to better be able to understand how they form, particularly in an environment similar to the harsh conditions of the early universe.⁣⁣\n ⁣⁣\nRead more at the link in @ESAWebb’s bio. ⁣⁣\n ⁣⁣\nImage description: A two image swipe-through of a star cluster is shown inside a large nebula of many-coloured gas and dust. 
The material forms dark ridges and peaks of gas and dust surrounding the cluster, lit on the inner side, while layers of diffuse, translucent clouds blanket over them. Around and within the gas, a huge number of distant galaxies can be seen, some quite large, as well as a few stars nearer to us which are very large and bright.⁣⁣\n ⁣⁣\nImage Credit: ESA/Webb, NASA & CSA, P. Zeidler, E. Sabbi, A. Nota, M. Zamani (ESA/Webb)⁣⁣\n ⁣⁣\n#JWST #Webb #JamesWebbSpaceTelescope #NGC602 #browndwarf #space #NASA #ESA", + "caption": "It’s giving rainbows and unicorns, like a middle school binder 🦄🌈 ⁣⁣\n⁣⁣\nMeet NGC 602, a young star cluster in the Small Magellanic Cloud (one of our satellite galaxies), where astronomers using @NASAWebb have found candidates for the first brown dwarfs outside of our galaxy. This star cluster has a similar environment to the kinds of star-forming regions that would have existed in the early universe - with very low amounts of elements heavier than hydrogen and helium. It’s drastically different from our own solar neighborhood and close enough to study in detail. ⁣⁣\n ⁣⁣\nBrown dwarfs are… not quite stars, but also not quite gas giant planets either. Typically they range from about 13 to 75 Jupiter masses. They are sometimes free-floating and not gravitationally bound to a star, like a planet would be. But they do share some characteristics with exoplanets, like storm patterns and atmospheric composition. ⁣⁣\n\n@NASAHubble showed us that NGC 602 harbors some very young low-mass stars; Webb is showing us how significant and extensive objects like brown dwarfs are in this cluster. Scientists are excited to better be able to understand how they form, particularly in an environment similar to the harsh conditions of the early universe.⁣⁣\n ⁣⁣\nRead more at the link in @ESAWebb’s bio. ⁣⁣\n ⁣⁣\nImage description: A two image swipe-through of a star cluster is shown inside a large nebula of many-coloured gas and dust. 
The material forms dark ridges and peaks of gas and dust surrounding the cluster, lit on the inner side, while layers of diffuse, translucent clouds blanket over them. Around and within the gas, a huge number of distant galaxies can be seen, some quite large, as well as a few stars nearer to us which are very large and bright.⁣⁣\n ⁣⁣\nImage Credit: ESA/Webb, NASA & CSA, P. Zeidler, E. Sabbi, A. Nota, M. Zamani (ESA/Webb)⁣⁣\n ⁣⁣\n#JWST #Webb #JamesWebbSpaceTelescope #NGC602 #browndwarf #space #NASA #ESA", "ownerFullName": "NASA", "ownerUsername": "nasa", "url": "https://www.instagram.com/p/DBea8-8Jn2z/", diff --git a/sources/platform/integrations/workflows-and-notifications/make/maps.md b/sources/platform/integrations/workflows-and-notifications/make/maps.md index bfcf0fa87f..19d83db44f 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/maps.md +++ b/sources/platform/integrations/workflows-and-notifications/make/maps.md @@ -57,7 +57,7 @@ The Search with Categories module is a component of the Google Maps Leads Scrape - _Exact Name Matching_: Find businesses with exact or partial name matches - _Operational Status Filter_: Exclude temporarily or permanently closed businesses -#### How It Works +#### How it works The module allows you to combine category filtering with location parameters to discover relevant business leads, data mine reviews, or extract relevant Google Maps information. You can use categories alone or with specific search terms to create precisely targeted lead lists. @@ -125,7 +125,7 @@ Categories can be general (e.g., "restaurant") which includes all variations lik } ``` -### Search with Search Terms Module +### Search with Search terms module The Search Terms module is a component of the Google Maps Leads Scraper designed to discover and extract business leads by using specific search queries, similar to how you'd search on Google Maps directly. 
@@ -140,13 +140,13 @@ The Search Terms module is a component of the Google Maps Leads Scraper designed
 - _Exact Name Matching_: Find businesses with exact or partial name matches
 - _Operational Status Filter_: Exclude temporarily or permanently closed businesses

-#### How It Works
+#### How it works

 This module allows you to enter search terms that match what you would typically type into the Google Maps search bar. You can search for general business types (like "coffee shop"), specific services ("dog grooming"), or product offerings ("organic produce"). The search results can be further refined using optional category filters, which help ensure you're capturing precisely the type of businesses you're targeting.

 For maximum efficiency, you can combine broader search terms with strategic category filters to capture the most relevant leads without excluding valuable prospects.

-### Advanced and Custom Search Module - Google Maps Leads Scraper
+### Advanced and Custom Search module - Google Maps Leads Scraper

 The Advanced and Custom Search module is the most powerful component of the Google Maps Leads Scraper, designed for sophisticated lead generation campaigns that require precise geographic targeting and advanced search capabilities. This module gives you complete control over your lead discovery process with multiple location definition methods and advanced filtering options.
@@ -159,18 +159,18 @@ The Advanced and Custom Search module is the most powerful component of the Goog
 - _Category Filtering_: Further refine results with optional category filters
 - _Comprehensive Lead Filtering_: Apply multiple quality filters simultaneously for precise lead targeting

-#### How It Works
+#### How it works

 This module provides the most flexible options for defining where and how to search for business leads:

-### Geographic Targeting Options
+### Geographic targeting options

 - _Simple Location Query_: Use natural language location inputs like "New York, USA"
 - _Structured Location Components_: Build precise locations using country, state, city, or county parameters
 - _Postal Code Targeting_: Target specific postal/ZIP code areas for direct mail campaigns
 - _Custom Polygon Areas_: Define exact geographic boundaries using coordinate pairs for ultra-precise targeting

-### Search and Filter Capabilities
+### Search and filter capabilities

 - _Keyword-Based Search_: Discover businesses using industry, service, or product terms
 - _Category-Based Filtering_: Apply Google's category system to refine results
diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/index.md b/sources/platform/integrations/workflows-and-notifications/n8n/index.md
index eecb6a3641..aeacf2cbd6 100644
--- a/sources/platform/integrations/workflows-and-notifications/n8n/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/n8n/index.md
@@ -34,7 +34,7 @@ If you're running a self-hosted n8n instance, you can install the Apify communit

 ![Apify Install Node](../../images/n8n-install-node-self-hosted.png)

-## Install the Apify Node (n8n Cloud)
+## Install the Apify node (n8n Cloud)

 For n8n Cloud users, installation is even simpler and doesn't require manual package entry. Just search and add the node from the canvas.
@@ -82,7 +82,7 @@ For simplicity on n8n Cloud, use the API key method if you prefer manual control With authentication set up, you can now create workflows that incorporate the Apify node. -## Create a Workflow with the Apify Node +## Create a workflow with the Apify node Start by building a basic workflow in n8n, then add the Apify node to handle tasks like running Actors or fetching data. diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md b/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md index 4b79fc7fc9..99e4962e27 100644 --- a/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md +++ b/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md @@ -157,7 +157,7 @@ You can access any of thousands of our scrapers on Apify Store by using the [gen You can select the _Crawler type_ by choosing the rendering engine (browser or HTTP client) and the _Content extraction algorithm_ from multiple HTML transformers. _Element selectors_ allow you to specify which elements to keep, remove, or click, while _URL patterns_ let you define inclusion and exclusion rules with glob syntax. You can also set _Crawling parameters_ like concurrency, depth, timeouts, and retries. For robust crawling, you can configure _Proxy configuration_ settings and select from various _Output options_ for content formats and storage. -## Usage as an AI Agent Tool +## Usage as an AI agent tool You can setup Apify's Scraper for AI Crawling node as a tool for your AI Agents. 
diff --git a/sources/platform/integrations/workflows-and-notifications/slack.md b/sources/platform/integrations/workflows-and-notifications/slack.md index ed46810a76..2bd47b0167 100644 --- a/sources/platform/integrations/workflows-and-notifications/slack.md +++ b/sources/platform/integrations/workflows-and-notifications/slack.md @@ -24,7 +24,7 @@ To use the Apify integration for Slack, you will need: - An [Apify account](https://console.apify.com/). - A Slack account (and workspace). -## Step 1: Set up the integration for Slack +## Step 1: set up the integration for slack You can find all integrations on an Actor's or task's **Integrations** tab. For example, you can try using the [Google Shopping Scraper](https://console.apify.com/actors/aLTexEuCetoJNL9bL). @@ -42,7 +42,7 @@ Once you are done, click the **Save** button. Click the **Start** button and head to the Slack channel you selected to see your first Apify integration notifications. -## Step 3: Start your run directly from Slack +## Step 3: start your run directly from slack You can now run the same Actor or task directly from Slack by typing `/apify call [Actor or task ID]` into the Slack message box. diff --git a/sources/platform/integrations/workflows-and-notifications/telegram.md b/sources/platform/integrations/workflows-and-notifications/telegram.md index c69ee337bd..ecf1703405 100644 --- a/sources/platform/integrations/workflows-and-notifications/telegram.md +++ b/sources/platform/integrations/workflows-and-notifications/telegram.md @@ -27,7 +27,7 @@ To use the Apify integration on Zapier, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Zapier account](https://zapier.com/). -### Step 1: Create Zap and find Apify on Zapier +### Step 1: create zap and find Apify on Zapier Once you have your Zapier account ready and you are successfully logged in, you can create your first Zap. 
diff --git a/sources/platform/integrations/workflows-and-notifications/windmill.md b/sources/platform/integrations/workflows-and-notifications/windmill.md index 188c3c697d..3f261a4ef2 100644 --- a/sources/platform/integrations/workflows-and-notifications/windmill.md +++ b/sources/platform/integrations/workflows-and-notifications/windmill.md @@ -28,7 +28,7 @@ The Apify integration provides scripts, flows, and resources that will be availa ![Apify Hub](../images/windmill-install-hub.png) -### Step 1: Import Apify scripts from Windmill Hub +### Step 1: import Apify scripts from Windmill hub You can import Apify integration scripts into your flows from the Windmill Hub, regardless of whether you're using Windmill Cloud or a self-hosted instance. The following components will be available: @@ -60,7 +60,7 @@ You can import Apify integration scripts into your flows from the Windmill Hub, You can provide the token to scripts via a **Windmill Resource**. Create it either in the **Resources** tab or directly from a script. -#### Option A — Create in the Resources tab +#### Option a - create in the resources tab 1. Open **Resources** → **New Resource**. 1. Select `apify_api_key` resource type. @@ -69,7 +69,7 @@ You can provide the token to scripts via a **Windmill Resource**. Create it eith ![Apify Auth](../images/windmill-install-auth-resource-tab.png) -#### Option B — Create/bind from a script +#### Option b - create/bind from a script 1. Open the script in Windmill UI. 1. Add a secret input parameter (e.g., `apify_token`) . @@ -78,7 +78,7 @@ You can provide the token to scripts via a **Windmill Resource**. Create it eith ![Apify Auth](../images/windmill-install-auth-script.png) -#### Option C — OAuth authentication +#### Option C - OAuth authentication :::note Cloud-only feature @@ -104,7 +104,7 @@ Let's create a simple workflow that runs an Actor and fetches its results. 1. In the Windmill UI, click **New Flow**. 1. 
Give your flow a descriptive name (e.g., "Run Actor and Get Results"). -### Step 2: Add the Run Actor script +### Step 2: add the run Actor script 1. Click **Add Step** and search for "Run Actor". 1. Select the **Run Actor** script. @@ -119,7 +119,7 @@ Let's create a simple workflow that runs an Actor and fetches its results. ![Apify Flow](../images/windmill-flow-run-actor.png) -### Step 3: Add the Get Dataset Items script +### Step 3: add the get dataset items script 1. Add another step and search for "Get Dataset Items". 1. Configure the inputs: @@ -190,7 +190,7 @@ Windmill provides webhook-based triggers that can automatically start workflows ![Apify Webhook](../images/windmill-webhook-test-runs.png) -## Deleting the webhook +## Delete the webhook 1. Fork the **Apify's Delete Webhook** script from the Windmill Hub. 1. Set either your _API Key_ or _OAuth Token_ resource diff --git a/sources/platform/integrations/workflows-and-notifications/workato.md b/sources/platform/integrations/workflows-and-notifications/workato.md index 76d52fe334..3dc3a2cd01 100644 --- a/sources/platform/integrations/workflows-and-notifications/workato.md +++ b/sources/platform/integrations/workflows-and-notifications/workato.md @@ -144,7 +144,7 @@ Each connector trigger and action field in Workato includes inline help text des The Apify connector provides the following triggers that monitor your Apify account for task completions: -### Actor Run Finished +### Actor run finished _Triggers when an Apify Actor run finishes (succeeds, fails, times out, or gets aborted)._ @@ -156,7 +156,7 @@ This trigger monitors a specific Apify Actor and starts the recipe when any run ![Screenshot of the Actor Run Finished trigger configuration in Workato](../images/workato/trigger-actor.png) -### Task Run Finished +### Task run finished _Triggers when an Apify Task run finishes (succeeds, fails, times out, or gets aborted)._ @@ -205,7 +205,7 @@ This action runs an Apify Task with optional input overrides 
and execution param ![Screenshot of the Run Task action configuration interface in Workato](../images/workato/run-task.png) -### Get Dataset Items +### Get dataset items _Retrieves items from a dataset with dynamic field mapping._ @@ -215,7 +215,7 @@ Select a dataset to dynamically generate output fields and retrieve its items. T - Retrieves data records from specified datasets with pagination support - Returns structured data ready for downstream recipe steps -#### Dynamic Schema Detection +#### Dynamic schema detection The connector samples your dataset to create appropriate output fields: @@ -231,7 +231,7 @@ For optimal results, use datasets where all items follow a consistent structure. ![Screenshot of the Get Dataset Items action configuration interface in Workato](../images/workato/get-dataset.png) -### Get Key-value store Record +### Get key-value store record _Retrieves a single record from a Key-value store._ diff --git a/sources/platform/integrations/workflows-and-notifications/zapier.md b/sources/platform/integrations/workflows-and-notifications/zapier.md index b72c572746..85698d68f1 100644 --- a/sources/platform/integrations/workflows-and-notifications/zapier.md +++ b/sources/platform/integrations/workflows-and-notifications/zapier.md @@ -23,7 +23,7 @@ To use the Apify integration on Zapier, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Zapier account](https://zapier.com/). -### Step 1: Create Zap and find Apify on Zapier +### Step 1: create zap and find Apify on Zapier Once you have your Zapier account ready and you are successfully logged in, you can create your first Zap. @@ -98,7 +98,7 @@ Once you are happy with the test, you can publish the Zap. When it is turned on, > Triggers when a selected Actor run is finished. -### Finished Task Run +### Finished task run > Triggers when a selected Actor task run is finished. @@ -123,15 +123,15 @@ Once you are happy with the test, you can publish the Zap. 
When it is turned on, ## Searches -### Fetch Dataset Items +### Fetch dataset items > Retrieves items from a [dataset](/platform/storage/dataset). -### Find Last Actor Run +### Find last Actor run > Finds the most recent Actor run. -### Find Last Task Run +### Find last task run > Finds the most recent Actor task run. diff --git a/sources/platform/proxy/datacenter_proxy.md b/sources/platform/proxy/datacenter_proxy.md index 1cf81cb789..9ed2300b93 100644 --- a/sources/platform/proxy/datacenter_proxy.md +++ b/sources/platform/proxy/datacenter_proxy.md @@ -47,7 +47,7 @@ This feature is also useful if you have your own pool of proxy servers and still Prices for dedicated proxy servers are mainly based on the number of proxy servers, their type, and location. [Contact us](https://apify.com/contact) for more information. -## Connecting to datacenter proxies +## Connect to datacenter proxies By default, each proxied HTTP request is potentially sent via a different target proxy server, which adds overhead and could be potentially problematic for websites which save cookies based on IP address. diff --git a/sources/platform/proxy/google_serp_proxy.md b/sources/platform/proxy/google_serp_proxy.md index ddc39a0f7b..438bc06ba4 100644 --- a/sources/platform/proxy/google_serp_proxy.md +++ b/sources/platform/proxy/google_serp_proxy.md @@ -24,7 +24,7 @@ Our Google SERP proxy currently supports the below services. When using the proxy, **pricing is based on the number of requests made**. -## Connecting to Google SERP proxy +## Connect to Google SERP proxy Requests made through the proxy are automatically routed through a proxy server from the selected country and pure **HTML code of the search result page is returned**. 
@@ -61,7 +61,7 @@ See a [full list](https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXW ## Examples -### Using the Apify SDK +### Use the Apify SDK If you are developing your own Apify [Actor](../actors/index.mdx) using the [Apify SDK](/sdk) and [Crawlee](https://crawlee.dev/), the most efficient way to use Google SERP proxy is [CheerioCrawler](https://crawlee.dev/api/cheerio-crawler/class/CheerioCrawler). This is because Google SERP proxy [only returns a page's HTML](./index.md). Alternatively, you can use the [got-scraping](https://github.com/apify/got-scraping) [npm package](https://www.npmjs.com/package/got-scraping) by specifying the proxy URL in the options. For Python, you can leverage the [`requests`](https://pypi.org/project/requests/) library along with the Apify SDK. @@ -145,7 +145,7 @@ await Actor.exit(); -### Using standard libraries and languages +### Use standard libraries and languages You can find your proxy password on the [Proxy page](https://console.apify.com/proxy/access) of Apify Console. diff --git a/sources/platform/proxy/residential_proxy.md b/sources/platform/proxy/residential_proxy.md index f65470e661..77fac37bdc 100644 --- a/sources/platform/proxy/residential_proxy.md +++ b/sources/platform/proxy/residential_proxy.md @@ -20,7 +20,7 @@ Residential proxies support [IP address rotation](./usage.md#ip-address-rotation **Pricing is based on data traffic**. It is measured for each connection made and displayed on your [proxy usage dashboard](https://console.apify.com/proxy/usage) in the Apify Console. -## Connecting to residential proxy +## Connect to residential proxy Connecting to residential proxy works the same way as [datacenter proxy](./datacenter_proxy.md), with two differences. 
diff --git a/sources/platform/proxy/usage.md b/sources/platform/proxy/usage.md index 0203aeb2f4..6aca8e5d1f 100644 --- a/sources/platform/proxy/usage.md +++ b/sources/platform/proxy/usage.md @@ -19,7 +19,7 @@ The full connection string has the following format: http://:@: ``` -:::caution +:::caution Password security All usage of Apify Proxy with your password is charged towards your account. Do not share the password with untrusted parties or use it from insecure networks, as **the password is sent unencrypted** due to the HTTP protocol's [limitations](https://www.guru99.com/difference-http-vs-https.html). ::: @@ -35,7 +35,7 @@ If you need to test Apify Proxy before you subscribe, please [contact our suppor | Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details.
**Note**: this is not your Apify username.| | Password | Apify Proxy password. Your password is displayed on the [Proxy](https://console.apify.com/proxy/groups) page in Apify Console.
**Note**: this is not your Apify account password. | -:::caution +:::caution External connections If you use these connection parameters for connecting to Apify Proxy from your Actors running on the Apify Platform, the connection will still be considered external, it will not work on the Free plan, and on paid plans you will be charged for external data transfer. Please use the connection parameters from the [Connection from Actors](#connection-from-actors) section when using Apify Proxy from Actors. ::: @@ -138,8 +138,8 @@ Web scrapers can rotate the IP addresses they use to access websites. They assig Depending on whether you use a [browser](https://apify.com/apify/web-scraper) or [HTTP requests](https://apify.com/apify/cheerio-scraper) for your scraping jobs, IP address rotation works differently. -* Browser—a different IP address is used for each browser. -* HTTP request—a different IP address is used for each request. +* Browser - a different IP address is used for each browser. +* HTTP request - a different IP address is used for each request. Use [sessions](#sessions) to control how you rotate IP addresses. See our guide [Anti-scraping techniques](/academy/anti-scraping/techniques) to learn more about IP address rotation and our findings on how blocking works. @@ -175,7 +175,7 @@ To test that your requests are proxied and IP addresses are being [rotated](/aca https://api.apify.com/v2/browser-info/ -### A different approach to `502 Bad Gateway` +### A different approach to `502 bad gateway` Sometimes when the `502` status code is not comprehensive enough. 
Therefore, we have modified our server with `590-599` codes instead to provide more insight:
diff --git a/sources/platform/schedules.md b/sources/platform/schedules.md
index ef13cc0048..be9b54e272 100644
--- a/sources/platform/schedules.md
+++ b/sources/platform/schedules.md
@@ -32,7 +32,7 @@ However, runs can be delayed because of a system overload or a server shutting d

 Each schedule can be associated with a maximum of _10_ Actors and _10_ Actor tasks.

-## Setting up a new schedule
+## Set up a new schedule

 Before setting up a new schedule, you should have the [Actor](./actors/index.mdx) or [task](./actors/running/tasks.md) you want to schedule prepared and tested.
diff --git a/sources/platform/security.md b/sources/platform/security.md
index ec13e53011..8159647805 100644
--- a/sources/platform/security.md
+++ b/sources/platform/security.md
@@ -72,7 +72,7 @@ _We are especially interested in reports that demonstrate:_
 - Authentication/authorization issues
 - Data leaks due to misconfiguration

-### Reporting process
+### Report a security issue

 If you notice or suspect a potential security issue, please report it to our security team at [security@apify.com](mailto:security@apify.com) with as much detail as possible, including the following:
@@ -109,6 +109,6 @@ Please adhere strictly to the following rules. Failure to do so may result in le

 :::

-## Securing your data
+## Secure your data

 The Apify platform provides you with multiple ways to secure your data, including [encrypted environment variables](./actors/development/programming_interface/environment_variables.md) for storing your configuration secrets and [encrypted input](./actors/development/actor_definition/input_schema/secret_input.md) for securing the input parameters of your Actors.
diff --git a/sources/platform/storage/dataset.md b/sources/platform/storage/dataset.md index b56d6008d9..5bebec9207 100644 --- a/sources/platform/storage/dataset.md +++ b/sources/platform/storage/dataset.md @@ -378,7 +378,7 @@ This feature is also useful when customizing your RSS feeds generated for variou By default, the whole result is wrapped in an `` element, while each page object is contained in an `` element. You can change this using the `xmlRoot` and `xmlRow` URL parameters when retrieving your data with a GET request. -## Sharing +## Share You can grant [access rights](../collaboration/index.md) to your dataset through the **Share** button under the **Actions** menu. For more details, check the [full list of permissions](../collaboration/list_of_permissions.md). @@ -386,7 +386,7 @@ You can also share datasets by link using their ID or name, depending on your ac For one-off sharing of specific records when access is restricted, you can generate time-limited pre-signed URLs. See [Sharing restricted resources with pre-signed URLs](/platform/collaboration/general-resource-access#pre-signed-urls). -### Sharing datasets between runs +### Share datasets between runs You can access a dataset from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run as long as you know its _name_ or _ID_. 
diff --git a/sources/platform/storage/key_value_store.md b/sources/platform/storage/key_value_store.md index 8d4bb85d6c..57c790b4e2 100644 --- a/sources/platform/storage/key_value_store.md +++ b/sources/platform/storage/key_value_store.md @@ -265,7 +265,7 @@ You can compress a record and use the [Content-Encoding request header](https:// _Using the [JavaScript SDK](/sdk/js/reference/class/KeyValueStore#setValue) or our [JavaScript API client](/api/client/js/reference/class/KeyValueStoreClient#setRecord) automatically compresses your files._ We advise utilizing the JavaScript API client for data compression prior to server upload and decompression upon retrieval, minimizing storage costs. -## Sharing +## Share You can grant [access rights](../collaboration/index.md) to your key-value store through the **Share** button under the **Actions** menu. For more details check the [full list of permissions](../collaboration/list_of_permissions.md). @@ -273,7 +273,7 @@ You can also share key-value stores by link using their ID or name, depending on For one-off sharing of specific records when access is restricted, you can generate time-limited pre-signed URLs. See [Sharing restricted resources with pre-signed URLs](/platform/collaboration/general-resource-access#pre-signed-urls). -### Sharing key-value stores between runs +### Share key-value stores between runs You can access a key-value store from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run as long as you know its _name_ or _ID_. 
diff --git a/sources/platform/storage/request_queue.md b/sources/platform/storage/request_queue.md index 32c0fa87b4..da7baecdcc 100644 --- a/sources/platform/storage/request_queue.md +++ b/sources/platform/storage/request_queue.md @@ -409,7 +409,7 @@ If the Actor processing the request fails, the lock expires, and the request is In the following example, we demonstrate how you can use locking mechanisms to avoid concurrent processing of the same request across multiple Actor runs. -:::info +:::info Lock mechanism The lock mechanism works on the client level, as well as the run level, when running the Actor on the Apify platform. This means you can unlock or prolong the lock the locked request only if: @@ -554,7 +554,7 @@ await Actor.exit(); A detailed tutorial on how to process one request queue with multiple Actor runs can be found in [Academy tutorials](https://docs.apify.com/academy/node-js/multiple-runs-scrape). -## Sharing +## Share You can grant [access rights](../collaboration/index.md) to your request queue through the **Share** button under the **Actions** menu. For more details check the [full list of permissions](../collaboration/list_of_permissions.md). @@ -562,7 +562,7 @@ You can also share request queues by link using their ID or name, depending on y For one-off sharing of specific records when access is restricted, you can generate time-limited pre-signed URLs. See [Sharing restricted resources with pre-signed URLs](/platform/collaboration/general-resource-access#pre-signed-urls). -### Sharing request queues between runs +### Share request queues between runs You can access a request queue from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run as long as you know its _name_ or _ID_. 
diff --git a/sources/platform/storage/usage.md b/sources/platform/storage/usage.md index ac55550620..980399cfa3 100644 --- a/sources/platform/storage/usage.md +++ b/sources/platform/storage/usage.md @@ -55,7 +55,7 @@ Additionally, you can quickly share the contents and details of your storage by ![Storage API](./images/overview-api.png) -These URLs link to API _endpoints_—the places where your data is stored. Endpoints that allow you to _read_ stored information do not require an [authentication token](/api/v2#authentication). Calls are authenticated using a hard-to-guess ID, allowing for secure sharing. However, operations such as _update_ or _delete_ require the authentication token. +These URLs link to API _endpoints_ - the places where your data is stored. Endpoints that allow you to _read_ stored information do not require an [authentication token](/api/v2#authentication). Calls are authenticated using a hard-to-guess ID, allowing for secure sharing. However, operations such as _update_ or _delete_ require the authentication token. > Never share a URL containing your authentication token, to avoid compromising your account's security.
@@ -77,11 +77,11 @@ With other request types and when using the `username~store-name`, however, you

 For further details and a breakdown of each storage API endpoint, refer to the [API documentation](/api/v2/storage-datasets).

-### Apify API Clients
+### Apify API clients

-The Apify API Clients allow you to access your datasets from any Node.js or Python application, whether it's running on the Apify platform or externally.
+The Apify API clients allow you to access your datasets from any Node.js or Python application, whether it's running on the Apify platform or externally.

-You can visit [API Clients](/api) documentations for more information.
+You can visit the [API clients](/api) documentation for more information.

 ### Apify SDKs

@@ -133,7 +133,7 @@ Go to the [API documentation](/api/v2#rate-limiting) for details and to learn wh

 Apify securely stores your ten most recent runs indefinitely, ensuring your records are always accessible. Unnamed datasets and runs beyond the latest ten will be automatically deleted after 7 days unless otherwise specified. Named datasets are retained indefinitely.

-### Preserving your storages
+### Preserve your storages

 To ensure indefinite retention of your storages, assign them a name. This can be done via Apify Console or through our API. First, you'll need your store's ID. You can find it in the details of the run that created it. In Apify Console, head over to your run's details and select the **Dataset**, **Key-value store**, or **Request queue** tab as appropriate. Check that store's details, and you will find its ID among them.
@@ -158,7 +158,7 @@ Named and unnamed storages are identical in all aspects except for their retenti

 For example, storage names `janedoe~my-storage-1` and `janedoe~web-scrape-results` are easier to tell apart than the alphanumerical IDs `cAbcYOfuXemTPwnIB` and `CAbcsuZbp7JHzkw1B`.
-## Sharing +## Share You can grant [access rights](../collaboration/index.md) to other Apify users to view or modify your storages. Check the [full list of permissions](../collaboration/list_of_permissions.md). @@ -172,7 +172,7 @@ If your storage resource is set to _restricted_, all API calls must include a va ::: -### Sharing storages between runs +### Share storages between runs Storage can be accessed from any [Actor](../actors/index.mdx) or [task](../actors/running/tasks.md) run, provided you have its _name_ or _ID_. You can access and manage storages from other runs using the same methods or endpoints as with storages from your current run. @@ -190,7 +190,7 @@ Learn how restricted access works in [General resource access](/platform/collabo ::: -## Deleting storages +## Delete storages Named storages are only removed upon your request.
You can delete storages in the following ways: From d9f3e25763e6de2cf6fe4d979002b61c62846f96 Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 08:53:34 +0000 Subject: [PATCH 2/9] docs: fix Actor capitalization and Step/Option format Fix remaining capitalization errors from previous commit: - Capitalize "Actor" and "Actors" in all headings (proper product name) - Capitalize "Standby mode" as proper feature name - Capitalize "I" in "How do I" questions - Capitalize text after "Step X:" and "Option X:" in headings These follow the Apify terminology guide where Actor is always capitalized as it's a proper product name, and sentence case rules require capitalizing the first word after colons in step/option headings. Co-Authored-By: Claude Sonnet 4.5 --- .../deployment/continuous_integration.md | 4 +- .../development/quick-start/start_locally.md | 2 +- .../platform/actors/running/actor_standby.md | 8 +-- sources/platform/actors/running/store.md | 60 +++++++++---------- .../integrations/data-storage/keboola.md | 6 +- .../workflows-and-notifications/make/index.md | 12 ++-- .../workflows-and-notifications/slack.md | 4 +- .../workflows-and-notifications/telegram.md | 4 +- .../workflows-and-notifications/windmill.md | 12 ++-- .../workflows-and-notifications/zapier.md | 2 +- 10 files changed, 57 insertions(+), 57 deletions(-) diff --git a/sources/platform/actors/development/deployment/continuous_integration.md b/sources/platform/actors/development/deployment/continuous_integration.md index e951180c79..b642fbcb4d 100644 --- a/sources/platform/actors/development/deployment/continuous_integration.md +++ b/sources/platform/actors/development/deployment/continuous_integration.md @@ -32,7 +32,7 @@ Set up continuous integration for your Actors using one of these methods: Choose the method that best fits your workflow. -## Option 1: trigger builds with a webhook +## Option 1: Trigger builds with a webhook 1. Push your Actor to a GitHub repository. 1. 
Go to your Actor's detail page in Apify Console, click on the API tab in the top right, then select API Endpoints. Copy the **Build Actor** API endpoint URL. The format is as follows: @@ -54,7 +54,7 @@ Choose the method that best fits your workflow. Now your Actor will automatically rebuild on every push to the GitHub repository. -## Option 2: set up automated builds and tests with GitHub actions +## Option 2: Set up automated builds and tests with GitHub actions 1. Push your Actor to a GitHub repository. 1. Get your Apify API token from the [Apify Console](https://console.apify.com/settings/integrations) diff --git a/sources/platform/actors/development/quick-start/start_locally.md b/sources/platform/actors/development/quick-start/start_locally.md index 406d39f064..52a0f7ffd9 100644 --- a/sources/platform/actors/development/quick-start/start_locally.md +++ b/sources/platform/actors/development/quick-start/start_locally.md @@ -143,7 +143,7 @@ Let's now deploy your Actor to the Apify platform, where you can run the Actor o apify push ``` -### Step 5: it's time to iterate! +### Step 5: It's time to iterate! Good job! 🎉 You're ready to develop your Actor. You can make changes to your Actor and implement your use case. diff --git a/sources/platform/actors/running/actor_standby.md b/sources/platform/actors/running/actor_standby.md index a7ebc58970..b2c171b477 100644 --- a/sources/platform/actors/running/actor_standby.md +++ b/sources/platform/actors/running/actor_standby.md @@ -14,7 +14,7 @@ Traditional Actors are designed to run a single job and then stop. They're mostl However, in some applications, waiting for an Actor to start is not an option. Actor Standby mode solves this problem by letting you have the Actor ready in the background, waiting for the incoming HTTP requests. In a sense, the Actor behaves like a real-time web server or standard API server. 
-## How do i know if standby mode is enabled +## How do I know if Standby mode is enabled You will know that the Actor is enabled for Standby mode if you see the **Standby** tab on the Actor's detail page. In the tab, you will find the hostname of the server, the description of the Actor's endpoints, @@ -25,7 +25,7 @@ hit the API endpoint and get results. ![Standby tab](./images/actor_standby/standby-tab.png) -## How do i pass input to Actors in standby mode +## How do I pass input to Actors in Standby mode If you're using an Actor built by someone else, see its Information tab to find out how the input should be passed. @@ -75,7 +75,7 @@ For requests sent to an Actor in Standby mode, the maximum time allowed until re The rate limit for incoming requests to a Standby Actor is _2000 requests per second_ per user account. -## How do i customize standby configuration +## How do I customize Standby configuration The Standby configuration currently consists of the following properties: @@ -99,6 +99,6 @@ However, running Actors in Standby mode might have unexpected costs, as the Acto No, even if you use the Actor-level hostname with the default configuration, the background Actor runs for your requests are not shared with other users. -## How can i develop Actors using standby mode +## How can I develop Actors using Standby mode See the [Actor Standby development section](../development/programming_interface/actor_standby.md). diff --git a/sources/platform/actors/running/store.md b/sources/platform/actors/running/store.md index 359f1a3ffd..0d1b96d885 100644 --- a/sources/platform/actors/running/store.md +++ b/sources/platform/actors/running/store.md @@ -28,9 +28,9 @@ All Actors in [Apify Store](https://apify.com/store) fall into one of the four p 3. [**Pay per event**](#pay-per-event) - you pay for specific events the Actor creator defines, such as generating a single result or starting the Actor. 
Most Actors include platform usage in the price, but some may charge it separately — check the Actor's pricing for details. 4. [**Pay per usage**](#pay-per-usage) - you can run the Actor and you pay for the platform usage the Actor generates. -### Rental actors +### Rental Actors -Rental actors are Actors for which you have to pay a recurring fee to the developer after your trial period ends. This empowers the developer to dedicate more time and effort to their Actors, thus ensuring they are of the _highest quality_ and receive _ongoing maintenance_. +Rental Actors are Actors for which you have to pay a recurring fee to the developer after your trial period ends. This empowers the developer to dedicate more time and effort to their Actors, thus ensuring they are of the _highest quality_ and receive _ongoing maintenance_. Most rental Actors have a _free trial_ period. The length of the trial is displayed on each Actor's page. @@ -39,28 +39,28 @@ Most rental Actors have a _free trial_ period. The length of the trial is displa After a trial period, a flat monthly _Actor rental_ fee is automatically subtracted from your prepaid platform usage in advance for the following month. Most of this fee goes directly to the developer and is paid on top of the platform usage generated by the Actor. You can read more about our motivation for releasing rental Actors in [this blog post](https://blog.apify.com/make-regular-passive-income-developing-web-automation-actors-b0392278d085/) from Apify's CEO Jan Čurn. -#### Rental actors - Frequently asked questions +#### Rental Actors - Frequently asked questions -##### Can I run rental actors via API or the Apify client? +##### Can I run rental Actors via API or the Apify client? Yes, when you are renting an Actor, you can run it using either our [API](/api/v2), [JavaScript](/api/client/js) or [Python](/api/client/python) clients as you would do with private or free public Actors. -##### Do I pay platform costs for running rental actors? 
+##### Do I pay platform costs for running rental Actors? [//]: # (TODO better link for platform usage costs explaining what it is!) Yes, you will pay normal [platform usage costs](https://apify.com/pricing) on top of the monthly Actor rental fee. The platform costs work exactly the same way as for free public Actors or your private Actors. You should find estimates of the cost of usage in each individual rental Actor's README ([see an example](https://apify.com/compass/crawler-google-places#how-much-will-it-cost)). -##### Do I need an Apify paid plan to use rental actors? +##### Do I need an Apify paid plan to use rental Actors? You don't need a paid plan to start a rental Actor's free trial. Just activate the trial, and you are good to go. After that, you will need to subscribe to one of [Apify's paid plans](https://apify.com/pricing) in order to keep renting the Actor and continue using it. -##### When will I be charged for the actor rental? +##### When will I be charged for the Actor rental? You always prepay the Actor rental for the following month. The first payment happens when the trial expires, and then recurs monthly. When you open the Actor in the Apify Console, you will see when the next rental payment is due, and you will also receive a notification when it happens. _Example_: You activate a 7-day trial of an Actor at _noon of April 1, 2021_. If you don't turn off auto-renewal, you will be charged at _noon on April 8, 2021_, then _May 8, 2021_. -##### How am I charged for actor rental? +##### How am I charged for Actor rental? The rental fee for an Actor is automatically subtracted from your prepaid platform usage, similarly to, e.g. [compute units](./usage_and_resources.md). If you don't have enough usage prepaid, you will need to cover any overage in the next invoice. 
@@ -68,17 +68,17 @@ The rental fee for an Actor is automatically subtracted from your prepaid platfo If you have an [Apify paid plan](https://apify.com/pricing), the monthly rental fee will be automatically subtracted from your plan's prepaid usage at the end of your free trial, and you will be able to run the Actor for another month. If you are not subscribed to any of [Apify's paid plans](https://apify.com/pricing), you will need to subscribe to one in order to continue using the Actor after the trial has ended. -##### Can I cancel my actor rental? +##### Can I cancel my Actor rental? _You can cancel the Actor rental_ during your trial or any time after that so you don't get charged when your current Actor rental period expires. You can always turn it back on later if you want. -##### Where can I see how much I have paid for actor rental? +##### Where can I see how much I have paid for Actor rental? Since Actor rental fees are paid from prepaid platform usage, these fees conceptually belong under platform usage. -You can find the breakdown of how much you have been charged for rental actors in the **Actors** tab, which you will find within the **Current period** tab in the [Billing](https://console.apify.com/billing) section. +You can find the breakdown of how much you have been charged for rental Actors in the **Actors** tab, which you will find within the **Current period** tab in the [Billing](https://console.apify.com/billing) section. -![Rental actors billing in Apify Console](./images/store/billing-paid-actors.png) +![Rental Actors billing in Apify Console](./images/store/billing-paid-actors.png) ### Pay per result @@ -92,15 +92,15 @@ This makes it transparent and easy to estimate upfront costs. If you have any fe -#### Pay per result actors - Frequently asked questions +#### Pay per result Actors - Frequently asked questions -##### How do I know an actor is paid per result? +##### How do I know an Actor is paid per result? 
When you try the Actor on the platform, you will see that the Actor is paid per result next to the Actor name. ![Actor paid per result in Console](./images/store/console_pay_per_result_tag.png) -##### Do I need to pay a monthly rental fee to run the actor? +##### Do I need to pay a monthly rental fee to run the Actor? No, the Actor is free to run. You only pay for the results. @@ -112,17 +112,17 @@ Under the **pay per result** model, all platform costs generated _during the run You will still be charged for the timed storage of the data in the same fashion as with any other Actor. You can always decide to delete the dataset to reduce your costs after you export the data from the platform. By default, any unnamed dataset will be automatically removed after your data retention period, so usually, this is nothing to worry about. -##### Can I set a cap on how many results an actor should return? +##### Can I set a cap on how many results an Actor should return? You can set a limit on how many items an Actor should return and the amount you will be charged in Options on the Actor detail page in the section below the Actor input. ![Max items for pay-per-result](./images/store/max-items-for-pay-per-result.png) -##### Can I publish an actor that is paid per result? +##### Can I publish an Actor that is paid per result? Yes, you can publish an Actor that is paid per result. -##### Where do I see how much I was charged for the pay per result actors? +##### Where do I see how much I was charged for the pay per result Actors? You can see the overview of how much you have been charged for Actors paid by result on your invoices and in the [Usage tab](https://console.apify.com/billing) of the Billing section in Console. It will be shown there as a separate service. @@ -149,9 +149,9 @@ Most pay per event Actors include platform usage in the event price. 
However, so ::: -#### Pay per event actors - Frequently asked questions +#### Pay per event Actors - Frequently asked questions -#### How do I know actor is paid per events? +#### How do I know an Actor is paid per events? You will see that the Actor is paid per events next to the Actor name. @@ -159,25 +159,25 @@ You will see that the Actor is paid per events next to the Actor name. ![Example pay per event Actor](./images/store/pay_per_event_example_actor.png) -#### Do I need to pay a monthly rental fee to run the actor? +#### Do I need to pay a monthly rental fee to run the Actor? No, you only pay for the events. #### What happens when I interact with the dataset after the run finishes? -You would still pay for all interactions after the Actor run finishes, same as for pay per result actors. +You would still pay for all interactions after the Actor run finishes, same as for pay per result Actors. #### Do I pay for the storage of results on the Apify platform? -You would still pay for the long term storage of results, same as for pay per result actors. +You would still pay for the long term storage of results, same as for pay per result Actors. -#### Do I need to pay for platform usage with pay per event actors? +#### Do I need to pay for platform usage with pay per event Actors? -In most cases, no - the majority of pay per event actors include [platform usage](./usage_and_resources.md) in the event price, so you only pay for the events. However, some Actors may charge platform usage separately, in addition to the event costs. Always check the pricing section on the Actor's page - it clearly states whether platform usage is included or not. +In most cases, no - the majority of pay per event Actors include [platform usage](./usage_and_resources.md) in the event price, so you only pay for the events. However, some Actors may charge platform usage separately, in addition to the event costs. 
Always check the pricing section on the Actor's page - it clearly states whether platform usage is included or not.
![Pay per event with usage not included in Apify Store](./images/store/pay_per_event_and_usage_example_actor.png)
-#### Where do i see how much i was charged for the pay per event actors?
+#### Where do I see how much I was charged for the pay per event Actors?
You can see how much you have been charged on your invoices, and on the [Usage tab](https://console.apify.com/billing) of the Billing section in the Console.
@@ -187,13 +187,13 @@ You can also see the cost of each run on the run detail itself.
![Pay per event actor - run detail](./images/store/pay_per_event_price_on_run_detail.png)
-#### Can I put a cap on a cost of a single actor run?
+#### Can I put a cap on the cost of a single Actor run?
Yes, when starting an Actor run, you can define the maximum limit on the cost of that run. When the Actor reaches the defined limit, it should terminate gracefully. Even if it didn't, for any reason, and kept producing results, we always make sure you are never charged more than your defined limit.
![Pay per event actor - max charge per run](./images/store/pay_per_event_price_on_run_detail.png)
-#### How do I raise a dispute if the charges for an actor seem off?
+#### How do I raise a dispute if the charges for an Actor seem off?
In such a case, please do not hesitate to contact the Actor author or our support team. If you suspect a bug in the Actor, you can also always create an issue on the Actor detail in the Apify Console.
@@ -211,13 +211,13 @@ _For more information on platform usage cost see the [usage and resources](./usa
:::
-## Report issues with actors
+## Report issues with Actors
Each Actor has an **Issues** tab in Apify Console. There, you can open an issue (ticket) and chat with the Actor's author, platform admins, and other users of this Actor. Please feel free to use the tab to ask any questions, request new features, or give feedback.
Alternatively, you can always write to [community@apify.com](mailto:community@apify.com). -![Paid actors' issues tab](./images/store/paid-actors-issues-tab.png) +![Paid Actors' issues tab](./images/store/paid-actors-issues-tab.png) ## Apify Store discounts diff --git a/sources/platform/integrations/data-storage/keboola.md b/sources/platform/integrations/data-storage/keboola.md index fbedc427a9..a09671a0dd 100644 --- a/sources/platform/integrations/data-storage/keboola.md +++ b/sources/platform/integrations/data-storage/keboola.md @@ -21,7 +21,7 @@ To use the Apify integration on Keboola, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Keboola account](https://www.keboola.com/). -### Step 1: create a new data source in Keboola +### Step 1: Create a new data source in Keboola Once your Keboola account is ready and you are logged in, navigate to the **Components** section in the top menu and click the **Add Component** button. @@ -35,7 +35,7 @@ Provide a name and description for your configuration, then click the **Create C ![Keboola configuration setup](../images/keboola/keboola-create-configuration.png) -### Step 2: configure the Apify data source +### Step 2: Configure the Apify data source With the new configuration created, you can now configure the data source to retrieve the needed data. Click on the **Configure Component** button to begin the setup process. @@ -75,7 +75,7 @@ Once you have filled in all the necessary options, click the **Save** button to ![Keboola component specification setup](../images/keboola/keboola-setup-specification.png) -### Step 3: run the configured data source +### Step 3: Run the configured data source After your data source has been configured, you can run it by clicking the **Run** button in the upper-right corner of your configuration. 
diff --git a/sources/platform/integrations/workflows-and-notifications/make/index.md b/sources/platform/integrations/workflows-and-notifications/make/index.md
index 1a8bfd8e00..952886ef54 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/index.md
@@ -65,7 +65,7 @@ The primary difference between the two methods is that the synchronous run waits
In this example, we will demonstrate how to run an Actor synchronously and export the output to Google Sheets. The same principle applies to the module that runs a task.
-#### Step 1: add the Apify "Run an actor" module
+#### Step 1: Add the Apify "Run an actor" module
First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Run an Actor" to your scenario and configure it.
@@ -75,7 +75,7 @@ Make sure to set the "Run synchronously" option to "Yes," so the module waits fo
![make-com-sync-2.png](../../images/make-com/make-com-sync-2.png)
-#### Step 2: add the Apify "Get dataset items" module
+#### Step 2: Add the Apify "Get dataset items" module
In the next step, add the "Get Dataset Items" module to your scenario, which is responsible for retrieving the output data from the Actor run.
@@ -84,7 +84,7 @@ You can find this dataset ID in the variables generated by the previous "Run an
![make-com-sync-3.png](../../images/make-com/make-com-sync-3.png)
-#### Step 3: add the Google sheets "Create spreadsheet rows" module
+#### Step 3: Add the Google sheets "Create spreadsheet rows" module
Finally, add the Google Sheets "Bulk Add Rows" module to your scenario. This module will automatically create new rows in a Google Sheets file to store the Actor's output.
@@ -99,7 +99,7 @@ You’re all set!
Once the scenario is started, it will run the Actor synchronou In this example, we will demonstrate how to run an Actor asynchronously and export its output to Google Sheets. Before starting, decide where you want to initiate the Actor run. You can do this manually via the Apify console, on a schedule, or from a separate Make.com scenario. -#### Step 1: add the Apify "Watch Actor runs" module +#### Step 1: Add the Apify "Watch Actor runs" module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Watch Actor Runs" to your scenario. This module will set up a webhook to listen for the finished runs of the selected Actor. @@ -108,7 +108,7 @@ For this example, we will use the "Google Maps Review Scraper" Actor. ![make-com-async-1.png](../../images/make-com/make-com-async-1.png) -#### Step 2: add the Apify "Get dataset items" module +#### Step 2: Add the Apify "Get dataset items" module Add the "Get Dataset Items" module to your scenario to retrieve the output of the Actor run. @@ -116,7 +116,7 @@ In the "Dataset ID" field, provide the default dataset ID from the Actor run. Yo ![make-com-async-2.png](../../images/make-com/make-com-async-2.png) -#### Step 3: add the Google sheets "Create spreadsheet rows" module +#### Step 3: Add the Google sheets "Create spreadsheet rows" module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario, which will create new rows in the specified Google Sheets file to store the Actor's output. diff --git a/sources/platform/integrations/workflows-and-notifications/slack.md b/sources/platform/integrations/workflows-and-notifications/slack.md index 2bd47b0167..97e986c654 100644 --- a/sources/platform/integrations/workflows-and-notifications/slack.md +++ b/sources/platform/integrations/workflows-and-notifications/slack.md @@ -24,7 +24,7 @@ To use the Apify integration for Slack, you will need: - An [Apify account](https://console.apify.com/). 
- A Slack account (and workspace). -## Step 1: set up the integration for slack +## Step 1: Set up the integration for slack You can find all integrations on an Actor's or task's **Integrations** tab. For example, you can try using the [Google Shopping Scraper](https://console.apify.com/actors/aLTexEuCetoJNL9bL). @@ -42,7 +42,7 @@ Once you are done, click the **Save** button. Click the **Start** button and head to the Slack channel you selected to see your first Apify integration notifications. -## Step 3: start your run directly from slack +## Step 3: Start your run directly from slack You can now run the same Actor or task directly from Slack by typing `/apify call [Actor or task ID]` into the Slack message box. diff --git a/sources/platform/integrations/workflows-and-notifications/telegram.md b/sources/platform/integrations/workflows-and-notifications/telegram.md index ecf1703405..172a9b473e 100644 --- a/sources/platform/integrations/workflows-and-notifications/telegram.md +++ b/sources/platform/integrations/workflows-and-notifications/telegram.md @@ -27,7 +27,7 @@ To use the Apify integration on Zapier, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Zapier account](https://zapier.com/). -### Step 1: create zap and find Apify on Zapier +### Step 1: Create zap and find Apify on Zapier Once you have your Zapier account ready and you are successfully logged in, you can create your first Zap. @@ -73,7 +73,7 @@ The connection is now created and the configuration form closed. ## Connect Telegram bot with Zapier -### Step 1: Create & connect new bot on Telegram +### Step 1: Create and connect new bot on Telegram After setting up Apify as your trigger within Zapier, it's time to set up Telegram as the action that will occur based on the trigger. 
diff --git a/sources/platform/integrations/workflows-and-notifications/windmill.md b/sources/platform/integrations/workflows-and-notifications/windmill.md index 3f261a4ef2..2da5fc9ab0 100644 --- a/sources/platform/integrations/workflows-and-notifications/windmill.md +++ b/sources/platform/integrations/workflows-and-notifications/windmill.md @@ -28,7 +28,7 @@ The Apify integration provides scripts, flows, and resources that will be availa ![Apify Hub](../images/windmill-install-hub.png) -### Step 1: import Apify scripts from Windmill hub +### Step 1: Import Apify scripts from Windmill hub You can import Apify integration scripts into your flows from the Windmill Hub, regardless of whether you're using Windmill Cloud or a self-hosted instance. The following components will be available: @@ -60,7 +60,7 @@ You can import Apify integration scripts into your flows from the Windmill Hub, You can provide the token to scripts via a **Windmill Resource**. Create it either in the **Resources** tab or directly from a script. -#### Option a - create in the resources tab +#### Option a - Create in the resources tab 1. Open **Resources** → **New Resource**. 1. Select `apify_api_key` resource type. @@ -69,7 +69,7 @@ You can provide the token to scripts via a **Windmill Resource**. Create it eith ![Apify Auth](../images/windmill-install-auth-resource-tab.png) -#### Option b - create/bind from a script +#### Option b - Create/bind from a script 1. Open the script in Windmill UI. 1. Add a secret input parameter (e.g., `apify_token`) . @@ -78,7 +78,7 @@ You can provide the token to scripts via a **Windmill Resource**. Create it eith ![Apify Auth](../images/windmill-install-auth-script.png) -#### Option C - OAuth authentication +#### Option C - OAuth authentication :::note Cloud-only feature @@ -104,7 +104,7 @@ Let's create a simple workflow that runs an Actor and fetches its results. 1. In the Windmill UI, click **New Flow**. 1. 
Give your flow a descriptive name (e.g., "Run Actor and Get Results"). -### Step 2: add the run Actor script +### Step 2: Add the run Actor script 1. Click **Add Step** and search for "Run Actor". 1. Select the **Run Actor** script. @@ -119,7 +119,7 @@ Let's create a simple workflow that runs an Actor and fetches its results. ![Apify Flow](../images/windmill-flow-run-actor.png) -### Step 3: add the get dataset items script +### Step 3: Add the get dataset items script 1. Add another step and search for "Get Dataset Items". 1. Configure the inputs: diff --git a/sources/platform/integrations/workflows-and-notifications/zapier.md b/sources/platform/integrations/workflows-and-notifications/zapier.md index 85698d68f1..88618a0a88 100644 --- a/sources/platform/integrations/workflows-and-notifications/zapier.md +++ b/sources/platform/integrations/workflows-and-notifications/zapier.md @@ -23,7 +23,7 @@ To use the Apify integration on Zapier, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Zapier account](https://zapier.com/). -### Step 1: create zap and find Apify on Zapier +### Step 1: Create zap and find Apify on Zapier Once you have your Zapier account ready and you are successfully logged in, you can create your first Zap. 
From ed9994d77e904460431c79e8a62017f58b35c8e2 Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 09:24:48 +0000 Subject: [PATCH 3/9] docs: fix remaining style guide issues from review MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Fix all 41 issues identified in Opus review: - Replace double-spaced hyphens ( - ) with single-spaced ( - ) - Fix remaining em dash in store.md - Capitalize "Actor" in image alt text (3 instances) - Capitalize proper nouns in headings: Slack, TikTok, Zap, Google Sheets, GitHub Actions, Bad Gateway - Capitalize "Actor" in module names - Capitalize Option letters (A, B, C) - Fix remaining gerund headings (Extending → Extend, Resetting → Reset) - Fix grammatical issues (Monitor and tracking → Monitor and track) - Restore correct heading (Report process → Reporting process) - Capitalize script names in Step headings All changes ensure compliance with Apify terminology guide and sentence case standards for headings. 
Co-Authored-By: Claude Sonnet 4.5 --- .../actor_definition/input_schema/specification.md | 2 +- .../development/deployment/continuous_integration.md | 2 +- sources/platform/actors/index.mdx | 2 +- sources/platform/actors/running/input_and_output.md | 2 +- sources/platform/actors/running/store.md | 8 ++++---- sources/platform/console/index.md | 2 +- sources/platform/integrations/ai/chatgpt.md | 4 ++-- sources/platform/integrations/ai/lindy.md | 2 +- .../platform/integrations/integrate_with_apify.md | 2 +- .../workflows-and-notifications/bubble.md | 12 ++++++------ .../workflows-and-notifications/gumloop/tiktok.md | 2 +- .../workflows-and-notifications/kestra.md | 6 +++--- .../workflows-and-notifications/make/index.md | 6 +++--- .../workflows-and-notifications/slack.md | 4 ++-- .../workflows-and-notifications/telegram.md | 2 +- .../workflows-and-notifications/windmill.md | 8 ++++---- .../workflows-and-notifications/zapier.md | 2 +- sources/platform/proxy/usage.md | 2 +- sources/platform/security.md | 2 +- 19 files changed, 36 insertions(+), 36 deletions(-) diff --git a/sources/platform/actors/development/actor_definition/input_schema/specification.md b/sources/platform/actors/development/actor_definition/input_schema/specification.md index 122a723548..af4a1dca87 100644 --- a/sources/platform/actors/development/actor_definition/input_schema/specification.md +++ b/sources/platform/actors/development/actor_definition/input_schema/specification.md @@ -372,7 +372,7 @@ Properties: | Property | Value | Required | Description | |------------|-----------------------------------------------------|----------|-------------------------------------------------------------------------------| -| `type` | One of
  • `integer`
  • `number`
| Yes | Defines the type of the field - either an integer or a floating-point number. | +| `type` | One of
  • `integer`
  • `number`
| Yes | Defines the type of the field - either an integer or a floating-point number. | | `editor` | One of:
  • `number`
  • `hidden`
| No | Visual editor used for input field. | | `maximum` | Integer or Number
(based on the `type`) | No | Maximum allowed value. | | `minimum` | Integer or Number
(based on the `type`) | No | Minimum allowed value. | diff --git a/sources/platform/actors/development/deployment/continuous_integration.md b/sources/platform/actors/development/deployment/continuous_integration.md index b642fbcb4d..93737196c3 100644 --- a/sources/platform/actors/development/deployment/continuous_integration.md +++ b/sources/platform/actors/development/deployment/continuous_integration.md @@ -54,7 +54,7 @@ Choose the method that best fits your workflow. Now your Actor will automatically rebuild on every push to the GitHub repository. -## Option 2: Set up automated builds and tests with GitHub actions +## Option 2: Set up automated builds and tests with GitHub Actions 1. Push your Actor to a GitHub repository. 1. Get your Apify API token from the [Apify Console](https://console.apify.com/settings/integrations) diff --git a/sources/platform/actors/index.mdx b/sources/platform/actors/index.mdx index 25b1c4e938..078f25eed6 100644 --- a/sources/platform/actors/index.mdx +++ b/sources/platform/actors/index.mdx @@ -6,7 +6,7 @@ category: platform slug: /actors --- -**Learn how to run, develop, and publish Apify Actors - serverless cloud programs for web data extraction and workflow automation.** +**Learn how to run, develop, and publish Apify Actors - serverless cloud programs for web data extraction and workflow automation.** import Card from "@site/src/components/Card"; import CardGrid from "@site/src/components/CardGrid"; diff --git a/sources/platform/actors/running/input_and_output.md b/sources/platform/actors/running/input_and_output.md index 920d6b4ab8..ca6fac65a7 100644 --- a/sources/platform/actors/running/input_and_output.md +++ b/sources/platform/actors/running/input_and_output.md @@ -41,7 +41,7 @@ As part of the input, you can also specify run options such as [Build](../develo :::info Dynamic memory -If the Actor is configured by developer to use [dynamic memory](../development/actor_definition/dynamic_actor_memory/index.md), the system will 
calculate the optimal memory allocation based on your input. In this case, the **Memory** option acts as an override - if you set it, the calculated value will be ignored.
+If the Actor is configured by the developer to use [dynamic memory](../development/actor_definition/dynamic_actor_memory/index.md), the system will calculate the optimal memory allocation based on your input. In this case, the **Memory** option acts as an override - if you set it, the calculated value will be ignored.
:::
diff --git a/sources/platform/actors/running/store.md b/sources/platform/actors/running/store.md
index 0d1b96d885..542e2ed8f5 100644
--- a/sources/platform/actors/running/store.md
+++ b/sources/platform/actors/running/store.md
@@ -25,7 +25,7 @@ All Actors in [Apify Store](https://apify.com/store) fall into one of the four p
1. [**Rental**](#rental-actors) - to continue using the Actor after the trial period, you must rent the Actor from the developer and pay a flat monthly fee in addition to the costs associated with the platform usage that the Actor generates.
2. [**Pay per result**](#pay-per-result) - you do not pay for platform usage the Actor generates and instead just pay for the results it produces.
-3. [**Pay per event**](#pay-per-event) - you pay for specific events the Actor creator defines, such as generating a single result or starting the Actor. Most Actors include platform usage in the price, but some may charge it separately — check the Actor's pricing for details.
+3. [**Pay per event**](#pay-per-event) - you pay for specific events the Actor creator defines, such as generating a single result or starting the Actor. Most Actors include platform usage in the price, but some may charge it separately - check the Actor's pricing for details.
4. [**Pay per usage**](#pay-per-usage) - you can run the Actor and you pay for the platform usage the Actor generates.
### Rental Actors @@ -181,17 +181,17 @@ In most cases, no - the majority of pay per event Actors include [platform usage You can see how much you have been charged on your invoices, and on the [Usage tab](https://console.apify.com/billing) of the Billing section in the Console. -![Pay per event actor - historical usage tab](./images/store/pay_per_event_historical_usage_tab.png) +![Pay per event Actor - historical usage tab](./images/store/pay_per_event_historical_usage_tab.png) You can also see the cost of each run on the run detail itself. -![Pay per event actor - run detail](./images/store/pay_per_event_price_on_run_detail.png) +![Pay per event Actor - run detail](./images/store/pay_per_event_price_on_run_detail.png) #### Can I put a cap on the cost of a single Actor run? Yes, when starting an Actor run, you can define the maximum limit on the cost of that run. When the Actor reaches the defined limit, it should terminate gracefully. Even if it didn't, for any reason, and kept producing results, we always make sure you are never charged more than your defined limit. -![Pay per event actor - max charge per run](./images/store/pay_per_event_price_on_run_detail.png) +![Pay per event Actor - max charge per run](./images/store/pay_per_event_price_on_run_detail.png) #### How do I raise a dispute if the charges for an Actor seem off? diff --git a/sources/platform/console/index.md b/sources/platform/console/index.md index d0c1e043bd..441daec9fa 100644 --- a/sources/platform/console/index.md +++ b/sources/platform/console/index.md @@ -62,7 +62,7 @@ After you create your account, you might still want to use the other authenticat ![Apify Console sign-in methods section on account page](./images/console-sign-in-methods-section.png) -## Resetting your password +## Reset your password This section also allows you to reset your password if you ever forget it. To do that, click the **Send email to reset password** button.
We will then send an email to the address connected to your account with a link to the password reset page. diff --git a/sources/platform/integrations/ai/chatgpt.md b/sources/platform/integrations/ai/chatgpt.md index 96b46addef..6311a92866 100644 --- a/sources/platform/integrations/ai/chatgpt.md +++ b/sources/platform/integrations/ai/chatgpt.md @@ -64,8 +64,8 @@ Once your connector is ready: > “Search the web and summarize recent trends in AI agents” -You’ll need to grant permission for each Apify tool when it’s used for the first time. -You should see ChatGPT calling Apify tools - such as the [RAG Web Browser](https://apify.com/apify/rag-web-browser) - to gather information. +You'll need to grant permission for each Apify tool when it's used for the first time. +You should see ChatGPT calling Apify tools - such as the [RAG Web Browser](https://apify.com/apify/rag-web-browser) - to gather information. ![ChatGPT Apify tools](../images/chatgpt-with-rag-web-browser.png) diff --git a/sources/platform/integrations/ai/lindy.md b/sources/platform/integrations/ai/lindy.md index 76846c5c95..42155e6623 100644 --- a/sources/platform/integrations/ai/lindy.md +++ b/sources/platform/integrations/ai/lindy.md @@ -58,7 +58,7 @@ You have access to thousands of Actors available on the [Apify Store](https://ap This establishes the fundamental workflow:
_Chatting with Lindy can now trigger the Apify Instagram Profile Scraper._ -### Extending your workflow +### Extend your workflow Lindy offers different triggers (e.g., _email received_, _Slack message received_, etc.) and actions beyond running an Actor. diff --git a/sources/platform/integrations/integrate_with_apify.md b/sources/platform/integrations/integrate_with_apify.md index c43b22b4c6..26509ac3b7 100644 --- a/sources/platform/integrations/integrate_with_apify.md +++ b/sources/platform/integrations/integrate_with_apify.md @@ -155,7 +155,7 @@ Users create their own Apify accounts and are billed directly by Apify for their Users access Apify through your platform without needing an Apify account. Apify bills you based on consumption, and you factor costs into your pricing. -### Monitor and tracking +### Monitor and track To help Apify monitor and support your integration, every API request should identify your platform. You can do this in one of two ways: diff --git a/sources/platform/integrations/workflows-and-notifications/bubble.md b/sources/platform/integrations/workflows-and-notifications/bubble.md index 3facf21af5..ed5856b613 100644 --- a/sources/platform/integrations/workflows-and-notifications/bubble.md +++ b/sources/platform/integrations/workflows-and-notifications/bubble.md @@ -166,8 +166,8 @@ There are two common approaches: ### Display data - This example appends the text result of an Actor run; it's a basic bind to the element’s text. -- Create / select the UI visual element - in this example, `Text`. -- In the Appearance tab, click the input area, select Insert dynamic data, and, according to your case, find the source - in this example, it's the `key_value_storages's recordContentText` custom state, where I set the result of the API call +- Create / select the UI visual element - in this example, `Text`.
+- In the Appearance tab, click the input area, select Insert dynamic data, and, depending on your use case, find the source - in this example, it's the `key_value_storages's recordContentText` custom state, where I set the result of the API call. - ![Display text data](../images/bubble/text_dynamic_content.png) ### Display list of data @@ -175,13 +175,13 @@ There are two common approaches: - This example lists the current user's datasets and displays them in a repeating group. - Add a **Repeating group** to the page. 1. Add data to a variable: create a custom state (for example, on the page) that will hold the list of datasets, and set it to the plugin's **List User Datasets** data call. - - ![Step 1 - Set variable with user's datasets](../images/bubble/user_dataset_repeating_group_set.png) + - ![Step 1 - Set variable with user's datasets](../images/bubble/user_dataset_repeating_group_set.png) 1. Set the type: in the repeating group's settings, set **Type of content** to match the dataset object your variable returns. - - ![Step 2 - Repeating group type of content](../images/bubble/user_dataset_repeating_group.png) + - ![Step 2 - Repeating group type of content](../images/bubble/user_dataset_repeating_group.png) 1. Bind the variable: set the repeating group's **Data source** to the variable from Step 1. - - ![Step 3 - Repeating group data source](../images/bubble/user_dataset_repeating_group_source.png) + - ![Step 3 - Repeating group data source](../images/bubble/user_dataset_repeating_group_source.png) - Inside the repeating group cell, bind dataset fields (for example, `Current cell's item name`, `id`, `createdAt`).
-- ![Step 4 - Repeating group data cell](../images/bubble/user_dataset_repeating_group_cell.png) +- ![Step 4 - Repeating group data cell](../images/bubble/user_dataset_repeating_group_cell.png) ## Long‑running scrapes and Bubble time limits (async pattern) diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md index 92bab2f59d..f473ba2916 100644 --- a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md +++ b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md @@ -25,7 +25,7 @@ You can pull the following types of data from TikTok using Gumloop’s TikTok no | Get video details | Get comprehensive data on a specific TikTok video using its URL - includes engagement and video-level metrics. | 5 credits per item | | Search videos | Search TikTok for videos and users using queries. Returns video details and user profile info. | 3 credits per item | -## Retrieve tiktok data in Gumloop +## Retrieve TikTok data in Gumloop 1. _Add the Gumloop TikTok MCP node_ diff --git a/sources/platform/integrations/workflows-and-notifications/kestra.md b/sources/platform/integrations/workflows-and-notifications/kestra.md index e4efca90e5..5816af65af 100644 --- a/sources/platform/integrations/workflows-and-notifications/kestra.md +++ b/sources/platform/integrations/workflows-and-notifications/kestra.md @@ -1,16 +1,16 @@ --- title: Kestra integration -description: Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events. +description: Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events. 
sidebar_label: Kestra sidebar_position: 7 slug: /integrations/kestra --- -**Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events.** +**Connect Apify with Kestra to orchestrate workflows - run flows, extract structured data, and react to Actor or task events.** --- -[Kestra](https://kestra.io/) is an open-source, event-driven orchestration platform. The [Apify plugin for Kestra](https://github.com/kestra-io/plugin-kestra) connects Apify Actors and storage to your workflows. Run scrapers, extract structured data - all defined declaratively in YAML and orchestrated directly from the UI. +[Kestra](https://kestra.io/) is an open-source, event-driven orchestration platform. The [Apify plugin for Kestra](https://github.com/kestra-io/plugin-kestra) connects Apify Actors and storage to your workflows. Run scrapers, extract structured data - all defined declaratively in YAML and orchestrated directly from the UI. This guide shows you how to set up the integration, configure authentication, and create a workflow that runs an Actor and processes its results. diff --git a/sources/platform/integrations/workflows-and-notifications/make/index.md b/sources/platform/integrations/workflows-and-notifications/make/index.md index 952886ef54..c1dd96d3fa 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/index.md +++ b/sources/platform/integrations/workflows-and-notifications/make/index.md @@ -65,7 +65,7 @@ The primary difference between the two methods is that the synchronous run waits In this example, we will demonstrate how to run an Actor synchronously and export the output to Google Sheets. The same principle applies to the module that runs a task. -#### Step 1: Add the Apify "Run an actor" module +#### Step 1: Add the Apify "Run an Actor" module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify).
Next, add the Apify module called "Run an Actor" to your scenario and configure it. @@ -84,7 +84,7 @@ You can find this dataset ID in the variables generated by the previous "Run an ![make-com-sync-3.png](../../images/make-com/make-com-sync-3.png) -#### Step 3: Add the Google sheets "Create spreadsheet rows" module +#### Step 3: Add the Google Sheets "Create spreadsheet rows" module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario. This module will automatically create new rows in a Google Sheets file to store the Actor's output. @@ -116,7 +116,7 @@ In the "Dataset ID" field, provide the default dataset ID from the Actor run. Yo ![make-com-async-2.png](../../images/make-com/make-com-async-2.png) -#### Step 3: Add the Google sheets "Create spreadsheet rows" module +#### Step 3: Add the Google Sheets "Create spreadsheet rows" module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario, which will create new rows in the specified Google Sheets file to store the Actor's output. diff --git a/sources/platform/integrations/workflows-and-notifications/slack.md b/sources/platform/integrations/workflows-and-notifications/slack.md index 97e986c654..ed46810a76 100644 --- a/sources/platform/integrations/workflows-and-notifications/slack.md +++ b/sources/platform/integrations/workflows-and-notifications/slack.md @@ -24,7 +24,7 @@ To use the Apify integration for Slack, you will need: - An [Apify account](https://console.apify.com/). - A Slack account (and workspace). -## Step 1: Set up the integration for slack +## Step 1: Set up the integration for Slack You can find all integrations on an Actor's or task's **Integrations** tab. For example, you can try using the [Google Shopping Scraper](https://console.apify.com/actors/aLTexEuCetoJNL9bL). @@ -42,7 +42,7 @@ Once you are done, click the **Save** button. Click the **Start** button and head to the Slack channel you selected to see your first Apify integration notifications.
-## Step 3: Start your run directly from slack +## Step 3: Start your run directly from Slack You can now run the same Actor or task directly from Slack by typing `/apify call [Actor or task ID]` into the Slack message box. diff --git a/sources/platform/integrations/workflows-and-notifications/telegram.md b/sources/platform/integrations/workflows-and-notifications/telegram.md index 172a9b473e..39d852d186 100644 --- a/sources/platform/integrations/workflows-and-notifications/telegram.md +++ b/sources/platform/integrations/workflows-and-notifications/telegram.md @@ -27,7 +27,7 @@ To use the Apify integration on Zapier, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Zapier account](https://zapier.com/). -### Step 1: Create zap and find Apify on Zapier +### Step 1: Create Zap and find Apify on Zapier Once you have your Zapier account ready and you are successfully logged in, you can create your first Zap. diff --git a/sources/platform/integrations/workflows-and-notifications/windmill.md b/sources/platform/integrations/workflows-and-notifications/windmill.md index 2da5fc9ab0..7a5cb1361a 100644 --- a/sources/platform/integrations/workflows-and-notifications/windmill.md +++ b/sources/platform/integrations/workflows-and-notifications/windmill.md @@ -60,7 +60,7 @@ You can import Apify integration scripts into your flows from the Windmill Hub, You can provide the token to scripts via a **Windmill Resource**. Create it either in the **Resources** tab or directly from a script. -#### Option a - Create in the resources tab +#### Option A - Create in the resources tab 1. Open **Resources** → **New Resource**. 1. Select `apify_api_key` resource type. @@ -69,7 +69,7 @@ You can provide the token to scripts via a **Windmill Resource**. Create it eith ![Apify Auth](../images/windmill-install-auth-resource-tab.png) -#### Option b - Create/bind from a script +#### Option B - Create/bind from a script 1. Open the script in Windmill UI. 1.
Add a secret input parameter (e.g., `apify_token`). @@ -104,7 +104,7 @@ Let's create a simple workflow that runs an Actor and fetches its results. 1. In the Windmill UI, click **New Flow**. 1. Give your flow a descriptive name (e.g., "Run Actor and Get Results"). -### Step 2: Add the run Actor script +### Step 2: Add the Run Actor script 1. Click **Add Step** and search for "Run Actor". 1. Select the **Run Actor** script. @@ -119,7 +119,7 @@ Let's create a simple workflow that runs an Actor and fetches its results. ![Apify Flow](../images/windmill-flow-run-actor.png) -### Step 3: Add the get dataset items script +### Step 3: Add the Get Dataset Items script 1. Add another step and search for "Get Dataset Items". 1. Configure the inputs: diff --git a/sources/platform/integrations/workflows-and-notifications/zapier.md b/sources/platform/integrations/workflows-and-notifications/zapier.md index 88618a0a88..cb56701432 100644 --- a/sources/platform/integrations/workflows-and-notifications/zapier.md +++ b/sources/platform/integrations/workflows-and-notifications/zapier.md @@ -23,7 +23,7 @@ To use the Apify integration on Zapier, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Zapier account](https://zapier.com/). -### Step 1: Create zap and find Apify on Zapier +### Step 1: Create Zap and find Apify on Zapier Once you have your Zapier account ready and you are successfully logged in, you can create your first Zap. diff --git a/sources/platform/proxy/usage.md b/sources/platform/proxy/usage.md index 6aca8e5d1f..f7bce549c8 100644 --- a/sources/platform/proxy/usage.md +++ b/sources/platform/proxy/usage.md @@ -175,7 +175,7 @@ To test that your requests are proxied and IP addresses are being [rotated](/aca https://api.apify.com/v2/browser-info/ -### A different approach to `502 bad gateway` +### A different approach to `502 Bad Gateway` Sometimes the `502` status code is not comprehensive enough.
Therefore, we have modified our server to return `590-599` codes instead to provide more insight: diff --git a/sources/platform/security.md b/sources/platform/security.md index 8159647805..52b682ad20 100644 --- a/sources/platform/security.md +++ b/sources/platform/security.md @@ -72,7 +72,7 @@ _We are especially interested in reports that demonstrate:_ - Authentication/authorization issues - Data leaks due to misconfiguration -### Report process +### Reporting process If you notice or suspect a potential security issue, please report it to our security team at [security@apify.com](mailto:security@apify.com) with as much detail as possible, including the following: From 4eaaf38888a242c8b1ed3ca9cef0a56314c82ddc Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 09:52:05 +0000 Subject: [PATCH 4/9] docs: apply all PR review feedback systematically MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Apply all corrections from PR #2243 review comments across entire codebase: Module/Product name capitalization: - Restore proper caps for Make.com modules: "Run an Actor", "Get Dataset Items", "Create Spreadsheet Rows", "Watch Actor Runs" - Restore module names: "Standard Settings Module", "Advanced Settings Module" - Fix product names: GitHub Actions, Google Sheets, Apify Console, Apify Store, Apify Proxy MCP capitalization: - Use "MCP server" (lowercase 's') in prose and headings - Keep "MCP Server" only in code strings and UI field names UI element names: - Capitalize: Build, Timeout, Memory (these are UI element names) Feature names: - "Restricted Access" when referring to the feature - "Data Source" in Keboola integration Revert example data changes: - Restore original capitalization in sample outputs (mastra.md) - Restore em dashes in JSON examples (amazon.md, instagram.md) Additional systematic fixes: - Fix "Apify console" → "Apify Console" (3 files) - Fix "Github" → "GitHub" (integrate_with_apify.md) - Fix "Apify
Platform" → "Apify platform" (proxy/usage.md) - Fix "first actor" → "first Actor" (build_with_ai.md) All changes ensure product names, module names, and UI elements are properly capitalized per Apify style guide and user feedback. Co-Authored-By: Claude Sonnet 4.5 --- .../development/quick-start/build_with_ai.md | 6 +++--- .../development/quick-start/start_locally.md | 2 +- .../actors/running/input_and_output.md | 2 +- .../collaboration/general-resource-access.md | 6 +++--- sources/platform/integrations/ai/agno.md | 2 +- sources/platform/integrations/ai/google-adk.md | 4 ++-- sources/platform/integrations/ai/mastra.md | 10 +++++----- sources/platform/integrations/ai/mcp.md | 6 +++--- sources/platform/integrations/ai/skyfire.md | 2 +- .../platform/integrations/ai/vercel-ai-sdk.md | 4 ++-- .../data-storage/airtable/index.md | 2 +- .../integrations/data-storage/keboola.md | 6 +++--- .../integrations/integrate_with_apify.md | 4 ++-- .../make/ai-crawling.md | 6 +++--- .../workflows-and-notifications/make/amazon.md | 2 +- .../workflows-and-notifications/make/index.md | 18 +++++++++--------- .../make/instagram.md | 4 ++-- .../workflows-and-notifications/make/llm.md | 4 ++-- .../workflows-and-notifications/workato.md | 4 ++-- sources/platform/proxy/usage.md | 8 ++++---- 20 files changed, 51 insertions(+), 51 deletions(-) diff --git a/sources/platform/actors/development/quick-start/build_with_ai.md b/sources/platform/actors/development/quick-start/build_with_ai.md index 43da362911..cd092d57fe 100644 --- a/sources/platform/actors/development/quick-start/build_with_ai.md +++ b/sources/platform/actors/development/quick-start/build_with_ai.md @@ -32,7 +32,7 @@ The prompt guides AI coding assistants such as Cursor, Claude Code or GitHub Cop 1. Create directory: `mkdir my-new-actor` 1. Open the directory in _Cursor_, _Claude Code_, _VS Code with GitHub Copilot_, etc. 1. Copy the prompt above and paste it into your AI coding assistant (Agent or Chat) -1. 
Run it, and develop your first actor with the help of AI +1. Run it, and develop your first Actor with the help of AI :::info Avoid copy-pasting @@ -52,9 +52,9 @@ If you do not have Apify CLI installed, see the [installation guide](/cli/docs/i The command above will guide you through Apify Actor initialization, where you select an Actor Template that works for you. The result is an initialized Actor (with AGENTS.md) ready for development. -## Use Apify MCP Server +## Use Apify MCP server -The Apify MCP Server has tools to search and fetch documentation. If you set it up in your AI editor, it will help you improve the generated code by providing additional context to the AI. +The Apify MCP server has tools to search and fetch documentation. If you set it up in your AI editor, it will help you improve the generated code by providing additional context to the AI. :::info Use Apify MCP server configuration diff --git a/sources/platform/actors/development/quick-start/start_locally.md b/sources/platform/actors/development/quick-start/start_locally.md index 52a0f7ffd9..a107d14db2 100644 --- a/sources/platform/actors/development/quick-start/start_locally.md +++ b/sources/platform/actors/development/quick-start/start_locally.md @@ -143,7 +143,7 @@ Let's now deploy your Actor to the Apify platform, where you can run the Actor o apify push ``` -### Step 5: It's time to iterate! +### Step 5: It's Time to Iterate! Good job! 🎉 You're ready to develop your Actor. You can make changes to your Actor and implement your use case. 
diff --git a/sources/platform/actors/running/input_and_output.md b/sources/platform/actors/running/input_and_output.md index ca6fac65a7..07186674de 100644 --- a/sources/platform/actors/running/input_and_output.md +++ b/sources/platform/actors/running/input_and_output.md @@ -27,7 +27,7 @@ When running an Actor using the [API](https://docs.apify.com/api/v2) you can pas } ``` -### Options - Build, timeout, and memory +### Options - Build, Timeout, and Memory As part of the input, you can also specify run options such as [Build](../development/builds_and_runs/builds.md), Timeout, and [Memory](./usage_and_resources.md) for your Actor run. diff --git a/sources/platform/collaboration/general-resource-access.md b/sources/platform/collaboration/general-resource-access.md index 4814f201b1..a5d4e38874 100644 --- a/sources/platform/collaboration/general-resource-access.md +++ b/sources/platform/collaboration/general-resource-access.md @@ -28,7 +28,7 @@ Access to resources that require explicit access - such as Actors, tasks or sche ![Setup account-level general resources access setting](./images/general-resouce-access//account-setting.png) -## How restricted access works +## How Restricted Access works If your **General resource access** is set to **Anyone with ID can read**, you can just send this link to anybody, and they will be able to download the data even if they don’t have an Apify account. However, once you change the setting to **Restricted**, this API call will require a valid token with access in order to work. In other words, you’ll have to explicitly share the dataset and you can only do that with people who have an Apify account. 
@@ -83,7 +83,7 @@ If you’re using a public Actor from Apify Store, you can choose to automatical - When enabled, your runs of public Actors are automatically visible to the Actor’s creator - Shared runs include logs, input, and output storages (dataset, key-value store, request queue) -This sharing works even if your account has **General resource access** set to **Restricted** - the platform applies specific permission checks to ensure the Actor creator can access only the relevant runs. +This sharing works even if your account has **General resource access** set to **Restricted** - the platform applies specific permission checks to ensure the Actor creator can access only the relevant runs. You can disable this behavior at any time by turning off the setting in your account. @@ -107,7 +107,7 @@ This means you don't need to manually adjust permissions or share multiple links ## Per-resource access control -The account level access control can be changed on individual resources. This can be done by setting the general access level to other than Restricted in the share dialog for a given resource. This way the resource level setting takes precedence over the account setting. +The account-level access control can be changed on individual resources. This can be done by setting the general access level to something other than Restricted in the share dialog for a given resource. This way, the resource-level setting takes precedence over the account setting.
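The precedence rule above can be sketched as a tiny lookup - a hypothetical model for illustration only, not the platform's actual API or setting names:

```python
# Illustrative setting values; the real options shown in Apify Console are
# "Restricted" and "Anyone with ID can read".
ACCOUNT_DEFAULT = "restricted"

def effective_access(resource_setting=None, account_setting=ACCOUNT_DEFAULT):
    """Resource-level setting wins when present; otherwise the account default applies."""
    return resource_setting if resource_setting is not None else account_setting

print(effective_access())                           # account default applies
print(effective_access("anyone-with-id-can-read"))  # per-resource override wins
```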
![Setup resource level access control](./images/general-resouce-access/share-resource-dialog.png) diff --git a/sources/platform/integrations/ai/agno.md b/sources/platform/integrations/ai/agno.md index b0bc22d5d1..802c5dd882 100644 --- a/sources/platform/integrations/ai/agno.md +++ b/sources/platform/integrations/ai/agno.md @@ -26,7 +26,7 @@ This guide shows how to integrate Apify Actors with Agno to empower your AI agen ### Prerequisites -- _Apify API token_: Obtain your API token from the [Apify console](https://console.apify.com/account/integrations). +- _Apify API token_: Obtain your API token from the [Apify Console](https://console.apify.com/account/integrations). - _OpenAI API key_: Get your API key from the [OpenAI platform](https://platform.openai.com/account/api-keys). :::tip Alternative LLM providers diff --git a/sources/platform/integrations/ai/google-adk.md b/sources/platform/integrations/ai/google-adk.md index 168c2e04e4..46e47f59a2 100644 --- a/sources/platform/integrations/ai/google-adk.md +++ b/sources/platform/integrations/ai/google-adk.md @@ -97,6 +97,6 @@ Find a pub near the Ferry Building in San Francisco. 
- [Apify Actors](https://docs.apify.com/platform/actors) - [Google ADK documentation](https://google.github.io/adk-docs/get-started/) - [What are AI agents?](https://blog.apify.com/what-are-ai-agents/) -- [Apify MCP Server](https://mcp.apify.com) -- [Apify MCP Server documentation](https://docs.apify.com/platform/integrations/mcp) +- [Apify MCP server](https://mcp.apify.com) +- [Apify MCP server documentation](https://docs.apify.com/platform/integrations/mcp) - [Apify OpenRouter proxy](https://apify.com/apify/openrouter) diff --git a/sources/platform/integrations/ai/mastra.md b/sources/platform/integrations/ai/mastra.md index d2ddf2ba25..ad71ba2929 100644 --- a/sources/platform/integrations/ai/mastra.md +++ b/sources/platform/integrations/ai/mastra.md @@ -6,7 +6,7 @@ sidebar_position: 11 slug: /integrations/mastra --- -**Learn how to build AI agents with Mastra and Apify Actors MCP Server.** +**Learn how to build AI agents with Mastra and Apify Actors MCP server.** --- @@ -22,7 +22,7 @@ Check out the [Mastra docs](https://mastra.ai/docs) for more information. ## What is MCP server -A [Model Context Protocol](https://modelcontextprotocol.io) (MCP) server exposes specific data sources or tools to agents via a standardized protocol. It acts as a bridge, connecting large language models (LLMs) to external systems like databases, APIs, or local filesystems. Built on a client-server architecture, MCP servers enable secure, real-time interaction, allowing agents to fetch context or execute actions without custom integrations. Think of it as a modular plugin system for agents, simplifying how they access and process data. Apify provides [Actors MCP Server](https://mcp.apify.com/) to expose [Apify Actors](https://docs.apify.com/platform/actors) from the [Apify Store](https://apify.com/store) as tools via the MCP protocol. +A [Model Context Protocol](https://modelcontextprotocol.io) (MCP) server exposes specific data sources or tools to agents via a standardized protocol. 
It acts as a bridge, connecting large language models (LLMs) to external systems like databases, APIs, or local filesystems. Built on a client-server architecture, MCP servers enable secure, real-time interaction, allowing agents to fetch context or execute actions without custom integrations. Think of it as a modular plugin system for agents, simplifying how they access and process data. Apify provides [Actors MCP server](https://mcp.apify.com/) to expose [Apify Actors](https://docs.apify.com/platform/actors) from the [Apify Store](https://apify.com/store) as tools via the MCP protocol. ## How to use Apify with Mastra via MCP @@ -124,7 +124,7 @@ await mcpClient.disconnect(); :::note Use any Apify Actor -Since it uses the [Apify MCP Server](https://mcp.apify.com), swap in any Apify Actor from the [Apify Store](https://apify.com/store) by updating the startup request’s `actors` parameter. +Since it uses the [Apify MCP server](https://mcp.apify.com), swap in any Apify Actor from the [Apify Store](https://apify.com/store) by updating the startup request’s `actors` parameter. No other changes are needed in the agent code. ::: @@ -147,7 +147,7 @@ You will see the agent’s output in the console, showing the results of the sea Connecting to Mastra MCP server... Fetching tools... Generating response for prompt: Search the web for the OpenAI TikTok profile URL, then extract and summarize its data. 
-### OpenAI TikTok profile summary +### OpenAI TikTok Profile Summary - **Profile URL**: [OpenAI on TikTok](https://www.tiktok.com/@openai?lang=en) - **Followers**: 608,100 - **Likes**: 3.4 million - **Videos Posted**: 156 @@ -216,7 +216,7 @@ await mcpClient.disconnect(); - [Apify Actors](https://docs.apify.com/platform/actors) - [Mastra Documentation](https://mastra.ai/docs) -- [Apify MCP Server](https://mcp.apify.com) +- [Apify MCP server](https://mcp.apify.com) - [How to use MCP with Apify Actors](https://blog.apify.com/how-to-use-mcp/) - [Apify Store](https://apify.com/store) - [What are AI Agents?](https://blog.apify.com/what-are-ai-agents/) diff --git a/sources/platform/integrations/ai/mcp.md b/sources/platform/integrations/ai/mcp.md index 56719f774b..cd93057081 100644 --- a/sources/platform/integrations/ai/mcp.md +++ b/sources/platform/integrations/ai/mcp.md @@ -17,7 +17,7 @@ using [Model Context Protocol](https://modelcontextprotocol.io/docs/getting-star discover and run Actors from [Apify Store](https://apify.com/store), access storages and results, and enable AI coding assistants to access Apify documentation and tutorials. -![Apify MCP Server](../../images/apify_mcp_server.png) +![Apify MCP server](../../images/apify_mcp_server.png) ## Prerequisites @@ -201,7 +201,7 @@ VS Code supports MCP through GitHub Copilot's agent mode (requires Copilot subsc :::tip One-click installation -Download and run the [Apify MCP Server `.mcpb` file](https://github.com/apify/actors-mcp-server/releases/latest/download/apify-mcp-server.mcpb) for one-click installation. +Download and run the [Apify MCP server `.mcpb` file](https://github.com/apify/actors-mcp-server/releases/latest/download/apify-mcp-server.mcpb) for one-click installation. ::: @@ -400,7 +400,7 @@ documentation queries. If you exceed this limit, you'll receive a `429` response ## Support and resources -The Apify MCP Server is an open-source project.
Report bugs, suggest features, or ask questions in the [GitHub repository](https://github.com/apify/apify-mcp-server/issues). +The Apify MCP server is an open-source project. Report bugs, suggest features, or ask questions in the [GitHub repository](https://github.com/apify/apify-mcp-server/issues). If you find this project useful, please star it on [GitHub](https://github.com/apify/apify-mcp-server) to show your support! diff --git a/sources/platform/integrations/ai/skyfire.md b/sources/platform/integrations/ai/skyfire.md index 5b464c68b1..cfc1be30ac 100644 --- a/sources/platform/integrations/ai/skyfire.md +++ b/sources/platform/integrations/ai/skyfire.md @@ -25,7 +25,7 @@ Keep in mind that agentic payments are an experimental feature and may undergo s With Skyfire integration, agents can discover available Apify Actors, execute scraping and automation tasks, and pay for services using pre-funded Skyfire tokens, all without human intervention. -## Use Skyfire with Apify MCP Server +## Use Skyfire with Apify MCP server The [Apify MCP server](https://docs.apify.com/platform/integrations/mcp) provides the simplest way for agents to access Apify's Actor library using Skyfire payments. 
diff --git a/sources/platform/integrations/ai/vercel-ai-sdk.md b/sources/platform/integrations/ai/vercel-ai-sdk.md index 5fb2d22727..c446843267 100644 --- a/sources/platform/integrations/ai/vercel-ai-sdk.md +++ b/sources/platform/integrations/ai/vercel-ai-sdk.md @@ -106,6 +106,6 @@ await mcpClient.close(); - [Apify Actors](https://docs.apify.com/platform/actors) - [Vercel AI SDK documentation](https://ai-sdk.dev/docs/introduction) - [What are AI agents?](https://blog.apify.com/what-are-ai-agents/) -- [Apify MCP Server](https://mcp.apify.com) -- [Apify MCP Server documentation](https://docs.apify.com/platform/integrations/mcp) +- [Apify MCP server](https://mcp.apify.com) +- [Apify MCP server documentation](https://docs.apify.com/platform/integrations/mcp) - [Apify OpenRouter proxy](https://apify.com/apify/openrouter) diff --git a/sources/platform/integrations/data-storage/airtable/index.md b/sources/platform/integrations/data-storage/airtable/index.md index 685c9235c7..d16b82b90d 100644 --- a/sources/platform/integrations/data-storage/airtable/index.md +++ b/sources/platform/integrations/data-storage/airtable/index.md @@ -64,7 +64,7 @@ The extension provides the following capabilities: ### Run Actor -1. Select any Actor from **Apify store** or **recently used Actors** +1. Select any Actor from **Apify Store** or **recently used Actors** ![Select Actor screen](../../images/airtable/airtable_actor_select.png) 1. Fill in the Actor input form. diff --git a/sources/platform/integrations/data-storage/keboola.md b/sources/platform/integrations/data-storage/keboola.md index a09671a0dd..e75ba93576 100644 --- a/sources/platform/integrations/data-storage/keboola.md +++ b/sources/platform/integrations/data-storage/keboola.md @@ -21,7 +21,7 @@ To use the Apify integration on Keboola, you will need to: - Have an [Apify account](https://console.apify.com/). - Have a [Keboola account](https://www.keboola.com/). 
-### Step 1: Create a new data source in Keboola +### Step 1: Create a new Data Source in Keboola Once your Keboola account is ready and you are logged in, navigate to the **Components** section in the top menu and click the **Add Component** button. @@ -35,7 +35,7 @@ Provide a name and description for your configuration, then click the **Create C ![Keboola configuration setup](../images/keboola/keboola-create-configuration.png) -### Step 2: Configure the Apify data source +### Step 2: Configure the Apify Data Source With the new configuration created, you can now configure the data source to retrieve the needed data. Click on the **Configure Component** button to begin the setup process. @@ -75,7 +75,7 @@ Once you have filled in all the necessary options, click the **Save** button to ![Keboola component specification setup](../images/keboola/keboola-setup-specification.png) -### Step 3: Run the configured data source +### Step 3: Run the configured Data Source After your data source has been configured, you can run it by clicking the **Run** button in the upper-right corner of your configuration. 
diff --git a/sources/platform/integrations/integrate_with_apify.md b/sources/platform/integrations/integrate_with_apify.md index 26509ac3b7..b4315e7ac5 100644 --- a/sources/platform/integrations/integrate_with_apify.md +++ b/sources/platform/integrations/integrate_with_apify.md @@ -184,12 +184,12 @@ For inspiration, check out the public repositories of Apify's existing external - Zapier - [Zapier integration documentation](https://docs.apify.com/platform/integrations/zapier) - - [Source code on Github](https://github.com/apify/apify-zapier-integration) + - [Source code on GitHub](https://github.com/apify/apify-zapier-integration) - Make.com - [Make.com integration documentation](https://docs.apify.com/platform/integrations/make) - Kestra - [Kestra integration documentation](https://kestra.io/plugins/plugin-apify) - - [Source code on Github](https://github.com/kestra-io/plugin-apify) + - [Source code on GitHub](https://github.com/kestra-io/plugin-apify) - Keboola - [Keboola integration documentation](https://docs.apify.com/platform/integrations/keboola) - [Source code on GitHub](https://github.com/apify/keboola-ex-apify/) (JavaScript) diff --git a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md index f487a70a6b..d1a08ebcf6 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md +++ b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md @@ -35,11 +35,11 @@ To use these modules, you need an [Apify account](https://console.apify.com) and Once connected, you can build workflows to automate website extraction and integrate results into your AI applications. -## Apify Scraper for website content modules +## Apify Scraper for Website Content modules After connecting the app, you can use one of the two modules as native scrapers to extract website content. 
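The Make.com modules described below wrap the Website Content Crawler Actor; outside Make.com, the same crawl can be triggered with a single Apify API call. The sketch below assumes the public `run-sync-get-dataset-items` endpoint and the Actor's `startUrls` input field - verify both against the Apify API reference and the Actor's input schema before relying on them:

```javascript
// Sketch: run the Website Content Crawler synchronously via the Apify API
// and receive its dataset items in one call. The Actor ID's "/" becomes "~"
// in the endpoint path.
function runSyncUrl(actorId, token) {
  return `https://api.apify.com/v2/acts/${actorId.replace('/', '~')}/run-sync-get-dataset-items?token=${token}`;
}

async function crawl(startUrl, token, fetchImpl = fetch) {
  const res = await fetchImpl(runSyncUrl('apify/website-content-crawler', token), {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ startUrls: [{ url: startUrl }] }),
  });
  if (!res.ok) throw new Error(`Actor run failed: ${res.status}`);
  return res.json(); // array of crawled-page objects, as shown below
}
```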
-### Standard settings module +### Standard Settings Module The Standard Settings module is a streamlined component of the Website Content Crawler that allows you to quickly extract content from websites using optimized default settings. This module is perfect for extracting content from blogs, documentation sites, knowledge bases, or any text-rich website to feed into AI models. @@ -95,7 +95,7 @@ For each crawled web page, you'll receive: } ``` -### Advanced settings module +### Advanced Settings Module The Advanced Settings module provides complete control over the content extraction process, allowing you to fine-tune every aspect of the crawling and transformation pipeline. This module is ideal for complex websites, JavaScript-heavy applications, or when you need precise control over content extraction. diff --git a/sources/platform/integrations/workflows-and-notifications/make/amazon.md b/sources/platform/integrations/workflows-and-notifications/make/amazon.md index cba0b527c1..9743f62744 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/amazon.md +++ b/sources/platform/integrations/workflows-and-notifications/make/amazon.md @@ -113,7 +113,7 @@ For Amazon URLs, you can extract: "reviewsCount": 107637, "thumbnailImage": "https://m.media-amazon.com/images/I/61gSpxZTZZL.__AC_SX300_SY300_QL70_ML2_.jpg", "breadCrumbs": "Electronics›Computers & Accessories›Computer Accessories & Peripherals›Keyboards, Mice & Accessories›Keyboard & Mouse Combos", - "description": "The stylish Logitech MK270 Wireless Keyboard and Mouse Combo is perfect for the home office or workplace. Ditch the touchpad for this full size keyboard and mouse. Easily connect using Logitech's plug and forget receiver - just plug it into the USB port, and you're ready to work. There's no lengthy installation procedure to slow you down. When you're on the move, the receiver stores comfortably inside the mouse. 
Both the keyboard and mouse included in the MK270 combo use wireless 2.4GHz connectivity to provide seamless, interruption free use. Use the keyboard within a 10 m range without keyboard lag. Work for longer with the MK270's long battery life. The keyboard can be used for up to 24 months, and the mouse for 12 months, without replacing batteries. The Logitech MK270 keyboard includes 8 hotkeys that are programmable to your most used applications to boost your productivity.", + "description": "The stylish Logitech MK270 Wireless Keyboard and Mouse Combo is perfect for the home office or workplace. Ditch the touchpad for this full size keyboard and mouse. Easily connect using Logitech's plug and forget receiver—just plug it into the USB port, and you're ready to work. There's no lengthy installation procedure to slow you down. When you're on the move, the receiver stores comfortably inside the mouse. Both the keyboard and mouse included in the MK270 combo use wireless 2.4GHz connectivity to provide seamless, interruption free use. Use the keyboard within a 10 m range without keyboard lag. Work for longer with the MK270's long battery life. The keyboard can be used for up to 24 months, and the mouse for 12 months, without replacing batteries. 
The Logitech MK270 keyboard includes 8 hotkeys that are programmable to your most used applications to boost your productivity.", "price": { "value": 21.98, "currency": "$" diff --git a/sources/platform/integrations/workflows-and-notifications/make/index.md b/sources/platform/integrations/workflows-and-notifications/make/index.md index c1dd96d3fa..60e14a8996 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/index.md +++ b/sources/platform/integrations/workflows-and-notifications/make/index.md @@ -58,14 +58,14 @@ If you anticipate that the Actor run will exceed the timeout, use the asynchrono ::: -The primary difference between the two methods is that the synchronous run waits for the Actor or task to finish and retrieves its output using the "Get Dataset Items" module. By contrast, the asynchronous run watches for the run of an Actor or task (which could have been triggered from another scenario, manually from Apify console or elsewhere) and gets its output once it finishes. +The primary difference between the two methods is that the synchronous run waits for the Actor or task to finish and retrieves its output using the "Get Dataset Items" module. By contrast, the asynchronous run watches for the run of an Actor or task (which could have been triggered from another scenario, manually from Apify Console or elsewhere) and gets its output once it finishes. ### Synchronous run using the action module In this example, we will demonstrate how to run an Actor synchronously and export the output to Google Sheets. The same principle applies to the module that runs a task. -#### Step 1: Add the Apify "Run an Actor" module +#### Step 1: Add the Apify "Run an Actor" Module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Run an Actor" to your scenario and configure it.
@@ -75,7 +75,7 @@ Make sure to set the "Run synchronously" option to "Yes," so the module waits fo ![make-com-sync-2.png](../../images/make-com/make-com-sync-2.png) -#### Step 2: Add the Apify "Get dataset items" module +#### Step 2: Add the Apify "Get Dataset Items" Module In the next step, add the "Get Dataset Items" module to your scenario, which is responsible for retrieving the output data from the Actor run. @@ -84,7 +84,7 @@ You can find this dataset ID in the variables generated by the previous "Run an ![make-com-sync-3.png](../../images/make-com/make-com-sync-3.png) -#### Step 3: Add the Google Sheets "Create spreadsheet rows" module +#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" Module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario. This module will automatically create new rows in a Google Sheets file to store the Actor's output. @@ -97,9 +97,9 @@ You’re all set! Once the scenario is started, it will run the Actor synchronou ### Asynchronous run using the trigger module In this example, we will demonstrate how to run an Actor asynchronously and export its output to Google Sheets. -Before starting, decide where you want to initiate the Actor run. You can do this manually via the Apify console, on a schedule, or from a separate Make.com scenario. +Before starting, decide where you want to initiate the Actor run. You can do this manually via the Apify Console, on a schedule, or from a separate Make.com scenario. -#### Step 1: Add the Apify "Watch Actor runs" module +#### Step 1: Add the Apify "Watch Actor Runs" Module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Watch Actor Runs" to your scenario. This module will set up a webhook to listen for the finished runs of the selected Actor. @@ -108,7 +108,7 @@ For this example, we will use the "Google Maps Review Scraper" Actor. 
![make-com-async-1.png](../../images/make-com/make-com-async-1.png) -#### Step 2: Add the Apify "Get dataset items" module +#### Step 2: Add the Apify "Get Dataset Items" Module Add the "Get Dataset Items" module to your scenario to retrieve the output of the Actor run. @@ -116,7 +116,7 @@ In the "Dataset ID" field, provide the default dataset ID from the Actor run. Yo ![make-com-async-2.png](../../images/make-com/make-com-async-2.png) -#### Step 3: Add the Google Sheets "Create spreadsheet rows" module +#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" Module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario, which will create new rows in the specified Google Sheets file to store the Actor's output. @@ -125,7 +125,7 @@ In the "Spreadsheet ID" field, enter the ID of the target Google Sheets file, wh ![make-com-async-3.png](../../images/make-com/make-com-async-3.png) That’s it! Once the Actor run is complete, its data will be exported to the Google Sheets file. -You can initiate the Actor run via the Apify console, a scheduler, or from another Make.com scenario. +You can initiate the Actor run via the Apify Console, a scheduler, or from another Make.com scenario. ## Available modules and triggers diff --git a/sources/platform/integrations/workflows-and-notifications/make/instagram.md b/sources/platform/integrations/workflows-and-notifications/make/instagram.md index 9dd2170d38..1c71f0d07f 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/instagram.md +++ b/sources/platform/integrations/workflows-and-notifications/make/instagram.md @@ -156,7 +156,7 @@ For each Instagram post, you will extract: "timestamp": "2024-11-08T17:30:07.000Z" }, { - "caption": "Take a deep breath...\n\nX-ray images from our Chandra X-ray Observatory helped astronomers confirm that most of the oxygen in the universe is synthesized in massive stars. 
So, everybody say \"thank you\" to supernova remnants (SNRs) like this one, which has enough oxygen for thousands of solar systems.\n\nSupernova remnants are, naturally, the remains of exploded stars. They're extremely important for understanding our galaxy. If it weren't for SNRs, there would be no Earth, no plants, animals, or people. This is because all the elements heavier than iron were made in a supernova explosion, so the only reason we find these elements on Earth or in our solar system - or any other extrasolar planetary system - is because those elements were formed during a supernova.\n\n@nasachandraxray's data is represented in this image by blue and purple, while optical data from @nasahubble and the Very Large Telescope in Chile are in red and green.\n\nImage description: The darkness of space is almost covered by the array of objects in this image. Stars of different sizes are strewn about, while a blue and red bubble of gas is at the center. An area of pink and green covers the bottom-right corner.\n\nCredit: X-ray (NASA/CXC/ESO/F.Vogt et al); Optical (ESO/VLT/MUSE), Optical (NASA/STScI)\n\n#NASA #Supernova #Space #Universe #Astronomy #Astrophotography #Telescope #Xray", + "caption": "Take a deep breath...\n\nX-ray images from our Chandra X-ray Observatory helped astronomers confirm that most of the oxygen in the universe is synthesized in massive stars. So, everybody say \"thank you\" to supernova remnants (SNRs) like this one, which has enough oxygen for thousands of solar systems.\n\nSupernova remnants are, naturally, the remains of exploded stars. They're extremely important for understanding our galaxy. If it weren't for SNRs, there would be no Earth, no plants, animals, or people. 
This is because all the elements heavier than iron were made in a supernova explosion, so the only reason we find these elements on Earth or in our solar system — or any other extrasolar planetary system — is because those elements were formed during a supernova.\n\n@nasachandraxray's data is represented in this image by blue and purple, while optical data from @nasahubble and the Very Large Telescope in Chile are in red and green.\n\nImage description: The darkness of space is almost covered by the array of objects in this image. Stars of different sizes are strewn about, while a blue and red bubble of gas is at the center. An area of pink and green covers the bottom-right corner.\n\nCredit: X-ray (NASA/CXC/ESO/F.Vogt et al); Optical (ESO/VLT/MUSE), Optical (NASA/STScI)\n\n#NASA #Supernova #Space #Universe #Astronomy #Astrophotography #Telescope #Xray", "ownerFullName": "NASA", "ownerUsername": "nasa", "url": "https://www.instagram.com/p/DBKBByizDHZ/", @@ -166,7 +166,7 @@ For each Instagram post, you will extract: "timestamp": "2024-10-15T19:27:29.000Z" }, { - "caption": "It’s giving rainbows and unicorns, like a middle school binder 🦄🌈 ⁣⁣\n⁣⁣\nMeet NGC 602, a young star cluster in the Small Magellanic Cloud (one of our satellite galaxies), where astronomers using @NASAWebb have found candidates for the first brown dwarfs outside of our galaxy. This star cluster has a similar environment to the kinds of star-forming regions that would have existed in the early universe - with very low amounts of elements heavier than hydrogen and helium. It’s drastically different from our own solar neighborhood and close enough to study in detail. ⁣⁣\n ⁣⁣\nBrown dwarfs are… not quite stars, but also not quite gas giant planets either. Typically they range from about 13 to 75 Jupiter masses. They are sometimes free-floating and not gravitationally bound to a star, like a planet would be. 
But they do share some characteristics with exoplanets, like storm patterns and atmospheric composition. ⁣⁣\n\n@NASAHubble showed us that NGC 602 harbors some very young low-mass stars; Webb is showing us how significant and extensive objects like brown dwarfs are in this cluster. Scientists are excited to better be able to understand how they form, particularly in an environment similar to the harsh conditions of the early universe.⁣⁣\n ⁣⁣\nRead more at the link in @ESAWebb’s bio. ⁣⁣\n ⁣⁣\nImage description: A two image swipe-through of a star cluster is shown inside a large nebula of many-coloured gas and dust. The material forms dark ridges and peaks of gas and dust surrounding the cluster, lit on the inner side, while layers of diffuse, translucent clouds blanket over them. Around and within the gas, a huge number of distant galaxies can be seen, some quite large, as well as a few stars nearer to us which are very large and bright.⁣⁣\n ⁣⁣\nImage Credit: ESA/Webb, NASA & CSA, P. Zeidler, E. Sabbi, A. Nota, M. Zamani (ESA/Webb)⁣⁣\n ⁣⁣\n#JWST #Webb #JamesWebbSpaceTelescope #NGC602 #browndwarf #space #NASA #ESA", + "caption": "It’s giving rainbows and unicorns, like a middle school binder 🦄🌈 ⁣⁣\n⁣⁣\nMeet NGC 602, a young star cluster in the Small Magellanic Cloud (one of our satellite galaxies), where astronomers using @NASAWebb have found candidates for the first brown dwarfs outside of our galaxy. This star cluster has a similar environment to the kinds of star-forming regions that would have existed in the early universe—with very low amounts of elements heavier than hydrogen and helium. It’s drastically different from our own solar neighborhood and close enough to study in detail. ⁣⁣\n ⁣⁣\nBrown dwarfs are… not quite stars, but also not quite gas giant planets either. Typically they range from about 13 to 75 Jupiter masses. They are sometimes free-floating and not gravitationally bound to a star, like a planet would be. 
But they do share some characteristics with exoplanets, like storm patterns and atmospheric composition. ⁣⁣\n\n@NASAHubble showed us that NGC 602 harbors some very young low-mass stars; Webb is showing us how significant and extensive objects like brown dwarfs are in this cluster. Scientists are excited to better be able to understand how they form, particularly in an environment similar to the harsh conditions of the early universe.⁣⁣\n ⁣⁣\nRead more at the link in @ESAWebb’s bio. ⁣⁣\n ⁣⁣\nImage description: A two image swipe-through of a star cluster is shown inside a large nebula of many-coloured gas and dust. The material forms dark ridges and peaks of gas and dust surrounding the cluster, lit on the inner side, while layers of diffuse, translucent clouds blanket over them. Around and within the gas, a huge number of distant galaxies can be seen, some quite large, as well as a few stars nearer to us which are very large and bright.⁣⁣\n ⁣⁣\nImage Credit: ESA/Webb, NASA & CSA, P. Zeidler, E. Sabbi, A. Nota, M. Zamani (ESA/Webb)⁣⁣\n ⁣⁣\n#JWST #Webb #JamesWebbSpaceTelescope #NGC602 #browndwarf #space #NASA #ESA", "ownerFullName": "NASA", "ownerUsername": "nasa", "url": "https://www.instagram.com/p/DBea8-8Jn2z/", diff --git a/sources/platform/integrations/workflows-and-notifications/make/llm.md b/sources/platform/integrations/workflows-and-notifications/make/llm.md index b806090cae..da7f429381 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/llm.md +++ b/sources/platform/integrations/workflows-and-notifications/make/llm.md @@ -39,7 +39,7 @@ Once connected, you can build workflows that search the web, extract content, an After connecting the app, you can use two modules to search and extract content. -### Standard Settings module +### Standard Settings Module Use Standard Settings to quickly search the web and extract content with optimized defaults. 
This is ideal for AI agents that need to answer questions or gather information from multiple sources. @@ -97,7 +97,7 @@ When you provide keywords, the module runs Google Search, parses the results, an - _Remove cookie warnings_: Dismiss cookie consent dialogs - _Debug mode_: Enable extraction diagnostics -### Advanced Settings module +### Advanced Settings Module The Advanced Settings module gives you full control over search and extraction. Use it for complex sites or production RAG pipelines. diff --git a/sources/platform/integrations/workflows-and-notifications/workato.md b/sources/platform/integrations/workflows-and-notifications/workato.md index 3dc3a2cd01..86051a6ad2 100644 --- a/sources/platform/integrations/workflows-and-notifications/workato.md +++ b/sources/platform/integrations/workflows-and-notifications/workato.md @@ -150,7 +150,7 @@ _Triggers when an Apify Actor run finishes (succeeds, fails, times out, or gets This trigger monitors a specific Apify Actor and starts the recipe when any run of that Actor reaches a terminal status. You can:
You can: -- Select from your recently used Actors or Apify store Actors +- Select from your recently used Actors or Apify Store Actors - Provide input using dynamic schema-based fields or raw JSON - Configure run options like memory allocation, timeout, and build version - Choose between synchronous (wait for completion) or asynchronous execution diff --git a/sources/platform/proxy/usage.md b/sources/platform/proxy/usage.md index f7bce549c8..da67e34185 100644 --- a/sources/platform/proxy/usage.md +++ b/sources/platform/proxy/usage.md @@ -25,7 +25,7 @@ All usage of Apify Proxy with your password is charged towards your account. Do ### External connection -If you want to connect to Apify Proxy from outside of the Apify Platform, you need to have a paid Apify plan (to prevent abuse). +If you want to connect to Apify Proxy from outside of the Apify platform, you need to have a paid Apify plan (to prevent abuse). If you need to test Apify Proxy before you subscribe, please [contact our support](https://apify.com/contact). | Parameter | Value / explanation | @@ -36,7 +36,7 @@ If you need to test Apify Proxy before you subscribe, please [contact our suppor | Password | Apify Proxy password. Your password is displayed on the [Proxy](https://console.apify.com/proxy/groups) page in Apify Console.
**Note**: this is not your Apify account password. | :::caution External connections -If you use these connection parameters for connecting to Apify Proxy from your Actors running on the Apify Platform, the connection will still be considered external, it will not work on the Free plan, and on paid plans you will be charged for external data transfer. Please use the connection parameters from the [Connection from Actors](#connection-from-actors) section when using Apify Proxy from Actors. +If you use these connection parameters for connecting to Apify Proxy from your Actors running on the Apify platform, the connection will still be considered external: it will not work on the Free plan, and on paid plans you will be charged for external data transfer. Please use the connection parameters from the [Connection from Actors](#connection-from-actors) section when using Apify Proxy from Actors. ::: Example connection string for external connections: @@ -47,7 +47,7 @@ http://auto:apify_proxy_EaAFg6CFhc4eKk54Q1HbGDEiUTrk480uZv03@proxy.apify.com:800 ### Connection from Actors -If you want to connect to Apify Proxy from Actors running on the Apify Platform, the recommended way is to use built-in proxy configuration tools in the [Apify SDK JavaScript](/sdk/js/docs/guides/proxy-management) or [Apify SDK Python](/sdk/python/docs/concepts/proxy-management) +If you want to connect to Apify Proxy from Actors running on the Apify platform, the recommended way is to use built-in proxy configuration tools in the [Apify SDK JavaScript](/sdk/js/docs/guides/proxy-management) or [Apify SDK Python](/sdk/python/docs/concepts/proxy-management). If you don't want to use these helpers, and want to connect to Apify Proxy manually, you can find the right configuration values in [environment variables](../actors/development/programming_interface/environment_variables.md) provided to the Actor.
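The manual, environment-variable approach can be sketched in a few lines. This assumes the standard `APIFY_PROXY_HOSTNAME`, `APIFY_PROXY_PORT`, and `APIFY_PROXY_PASSWORD` variables - check the environment variables reference for the exact names available in your runtime:

```javascript
// Sketch: assemble the Apify Proxy connection string from the environment
// variables available inside a running Actor. The "auto" username matches
// the example connection strings shown above.
function buildProxyUrl(env, username = 'auto') {
  const { APIFY_PROXY_HOSTNAME: host, APIFY_PROXY_PORT: port, APIFY_PROXY_PASSWORD: password } = env;
  if (!host || !port || !password) {
    throw new Error('Apify Proxy environment variables are not set');
  }
  return `http://${username}:${password}@${host}:${port}`;
}
```

Passing the resulting URL to your HTTP client routes requests through the in-platform proxy endpoint rather than the external one.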
By using this configuration, you ensure that you connect to Apify Proxy directly through the Apify infrastructure, bypassing any external connection via the Internet, thereby improving the connection speed, and ensuring you don't pay for external data transfer. @@ -169,7 +169,7 @@ If you need to allow communication to `apify.proxy.com`, add the following IP ad To view your connection status to [Apify Proxy](https://apify.com/proxy), open the URL below in the browser using the proxy. [http://proxy.apify.com/](http://proxy.apify.com/). If the proxy connection is working, the page should look something like this: -![Apify proxy status page](./images/proxy-status.png) +![Apify Proxy status page](./images/proxy-status.png) To test that your requests are proxied and IP addresses are being [rotated](/academy/anti-scraping/techniques) correctly, open the following API endpoint via the proxy. It shows information about the client IP address. From 477214cd4f490fbac394586b2cd7f62cd0e1c37a Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 10:02:48 +0000 Subject: [PATCH 5/9] docs: fix final capitalization inconsistencies Fix 4 remaining issues from final review: 1. maps.md: Fix heading capitalization - "Search" lowercase, "Leads" capitalized in product name "Google Maps Leads Scraper" 2. windmill.md: Restore "Windmill Hub" product name capitalization 3. n8n/index.md: Fix inconsistent "Node" capitalization - use lowercase "node" consistently across all headings 4. zapier.md: Fix "Finished Actor Run" to "Finished Actor run" for consistency with "Finished task run" All changes ensure proper sentence case with correct capitalization of product names and consistent terminology throughout. 
Co-Authored-By: Claude Sonnet 4.5 --- .../integrations/workflows-and-notifications/make/maps.md | 2 +- .../integrations/workflows-and-notifications/n8n/index.md | 4 ++-- .../integrations/workflows-and-notifications/windmill.md | 2 +- .../integrations/workflows-and-notifications/zapier.md | 2 +- 4 files changed, 5 insertions(+), 5 deletions(-) diff --git a/sources/platform/integrations/workflows-and-notifications/make/maps.md b/sources/platform/integrations/workflows-and-notifications/make/maps.md index 19d83db44f..fbb8fa5cf3 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/maps.md +++ b/sources/platform/integrations/workflows-and-notifications/make/maps.md @@ -146,7 +146,7 @@ This module allows you to enter search terms that match what you would typically The search results can be further refined using optional category filters, which help ensure you're capturing precisely the type of businesses you're targeting. For maximum efficiency, you can combine broader search terms with strategic category filters to capture the most relevant leads without excluding valuable prospects. -### Advanced and custom Search module - Google Maps leads Scraper +### Advanced and custom search module - Google Maps Leads Scraper The Advanced and Custom Search module is the most powerful component of the Google Maps Leads Scraper, designed for sophisticated lead generation campaigns that require precise geographic targeting and advanced search capabilities. This module gives you complete control over your lead discovery process with multiple location definition methods and advanced filtering options. 
diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/index.md b/sources/platform/integrations/workflows-and-notifications/n8n/index.md index aeacf2cbd6..65b5b90ffe 100644 --- a/sources/platform/integrations/workflows-and-notifications/n8n/index.md +++ b/sources/platform/integrations/workflows-and-notifications/n8n/index.md @@ -21,7 +21,7 @@ Before you begin, make sure you have: - An [Apify account](https://console.apify.com/) - An [n8n instance](https://docs.n8n.io/learning-path/) (self‑hosted or cloud) -## Install the Apify Node (self-hosted) +## Install the Apify node (self-hosted) If you're running a self-hosted n8n instance, you can install the Apify community node directly from the editor. This process adds the node to your available tools, enabling Apify operations in workflows. @@ -130,7 +130,7 @@ Actions allow you to perform operations like running an Actor within a workflow. 1. Save and execute the workflow ![Apify Node](../../images/n8n-workflow-example.png) -## Use Apify Node as an AI tool +## Use Apify node as an AI tool You can run Apify operations, retrieve the results, and use AI to process, analyze, and summarize the data, or generate insights and recommendations. diff --git a/sources/platform/integrations/workflows-and-notifications/windmill.md b/sources/platform/integrations/workflows-and-notifications/windmill.md index 7a5cb1361a..43c8f086ee 100644 --- a/sources/platform/integrations/workflows-and-notifications/windmill.md +++ b/sources/platform/integrations/workflows-and-notifications/windmill.md @@ -28,7 +28,7 @@ The Apify integration provides scripts, flows, and resources that will be availa ![Apify Hub](../images/windmill-install-hub.png) -### Step 1: Import Apify scripts from Windmill hub +### Step 1: Import Apify scripts from Windmill Hub You can import Apify integration scripts into your flows from the Windmill Hub, regardless of whether you're using Windmill Cloud or a self-hosted instance. 
The following components will be available: diff --git a/sources/platform/integrations/workflows-and-notifications/zapier.md b/sources/platform/integrations/workflows-and-notifications/zapier.md index cb56701432..c20c564b26 100644 --- a/sources/platform/integrations/workflows-and-notifications/zapier.md +++ b/sources/platform/integrations/workflows-and-notifications/zapier.md @@ -94,7 +94,7 @@ Once you are happy with the test, you can publish the Zap. When it is turned on, ## Triggers -### Finished Actor Run +### Finished Actor run > Triggers when a selected Actor run is finished. From 1a5233fbdd0e413c9b94e28615a419b97ddf0c9c Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 13:48:48 +0000 Subject: [PATCH 6/9] docs: resolve PR review comments on capitalization Preserve proper nouns, product names, UI elements, and module names while keeping generic terms lowercase per reviewer feedback. Co-Authored-By: Claude Opus 4.6 --- .../actors/development/quick-start/start_locally.md | 2 +- .../workflows-and-notifications/make/ai-crawling.md | 4 ++-- .../workflows-and-notifications/make/index.md | 12 ++++++------ .../workflows-and-notifications/make/llm.md | 4 ++-- .../workflows-and-notifications/make/maps.md | 4 ++-- .../workflows-and-notifications/n8n/index.md | 2 +- .../workflows-and-notifications/windmill.md | 2 +- .../workflows-and-notifications/workato.md | 4 ++-- 8 files changed, 17 insertions(+), 17 deletions(-) diff --git a/sources/platform/actors/development/quick-start/start_locally.md b/sources/platform/actors/development/quick-start/start_locally.md index a107d14db2..52a0f7ffd9 100644 --- a/sources/platform/actors/development/quick-start/start_locally.md +++ b/sources/platform/actors/development/quick-start/start_locally.md @@ -143,7 +143,7 @@ Let's now deploy your Actor to the Apify platform, where you can run the Actor o apify push ``` -### Step 5: It's Time to Iterate! +### Step 5: It's time to iterate! Good job! 
🎉 You're ready to develop your Actor. You can make changes to your Actor and implement your use case. diff --git a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md index d1a08ebcf6..bb37bf09d9 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md +++ b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md @@ -39,7 +39,7 @@ Once connected, you can build workflows to automate website extraction and integ After connecting the app, you can use one of the two modules as native scrapers to extract website content. -### Standard Settings Module +### Standard Settings module The Standard Settings module is a streamlined component of the Website Content Crawler that allows you to quickly extract content from websites using optimized default settings. This module is perfect for extracting content from blogs, documentation sites, knowledge bases, or any text-rich website to feed into AI models. @@ -95,7 +95,7 @@ For each crawled web page, you'll receive: } ``` -### Advanced Settings Module +### Advanced Settings module The Advanced Settings module provides complete control over the content extraction process, allowing you to fine-tune every aspect of the crawling and transformation pipeline. This module is ideal for complex websites, JavaScript-heavy applications, or when you need precise control over content extraction. 
diff --git a/sources/platform/integrations/workflows-and-notifications/make/index.md b/sources/platform/integrations/workflows-and-notifications/make/index.md index 60e14a8996..266b1ead11 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/index.md +++ b/sources/platform/integrations/workflows-and-notifications/make/index.md @@ -65,7 +65,7 @@ The primary difference between the two methods is that the synchronous run waits In this example, we will demonstrate how to run an Actor synchronously and export the output to Google Sheets. The same principle applies to a module that runs a task. -#### Step 1: Add the Apify "Run an Actor" Module +#### Step 1: Add the Apify "Run an Actor" module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Run an Actor" to your scenario and configure it. @@ -75,7 +75,7 @@ Make sure to set the "Run synchronously" option to "Yes," so the module waits fo ![make-com-sync-2.png](../../images/make-com/make-com-sync-2.png) -#### Step 2: Add the Apify "Get Dataset Items" Module +#### Step 2: Add the Apify "Get Dataset Items" module In the next step, add the "Get Dataset Items" module to your scenario, which is responsible for retrieving the output data from the Actor run. @@ -84,7 +84,7 @@ You can find this dataset ID in the variables generated by the previous "Run an ![make-com-sync-3.png](../../images/make-com/make-com-sync-3.png) -#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" Module +#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario. This module will automatically create new rows in a Google Sheets file to store the Actor's output. @@ -99,7 +99,7 @@ You’re all set!
Once the scenario is started, it will run the Actor synchronously. In this example, we will demonstrate how to run an Actor asynchronously and export its output to Google Sheets. Before starting, decide where you want to initiate the Actor run. You can do this manually via the Apify Console, on a schedule, or from a separate Make.com scenario. -#### Step 1: Add the Apify "Watch Actor Runs" Module +#### Step 1: Add the Apify "Watch Actor Runs" module First, ensure that you have [connected your Apify account to Make.com](#create-a-connection-to-apify). Next, add the Apify module called "Watch Actor Runs" to your scenario. This module will set up a webhook to listen for the finished runs of the selected Actor. @@ -108,7 +108,7 @@ For this example, we will use the "Google Maps Review Scraper" Actor. ![make-com-async-1.png](../../images/make-com/make-com-async-1.png) -#### Step 2: Add the Apify "Get Dataset Items" Module +#### Step 2: Add the Apify "Get Dataset Items" module Add the "Get Dataset Items" module to your scenario to retrieve the output of the Actor run. @@ -116,7 +116,7 @@ In the "Dataset ID" field, provide the default dataset ID from the Actor run. Yo ![make-com-async-2.png](../../images/make-com/make-com-async-2.png) -#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" Module +#### Step 3: Add the Google Sheets "Create Spreadsheet Rows" module Finally, add the Google Sheets "Bulk Add Rows" module to your scenario, which will create new rows in the specified Google Sheets file to store the Actor's output.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/llm.md b/sources/platform/integrations/workflows-and-notifications/make/llm.md index da7f429381..b806090cae 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/llm.md +++ b/sources/platform/integrations/workflows-and-notifications/make/llm.md @@ -39,7 +39,7 @@ Once connected, you can build workflows that search the web, extract content, an After connecting the app, you can use two modules to search and extract content. -### Standard Settings Module +### Standard Settings module Use Standard Settings to quickly search the web and extract content with optimized defaults. This is ideal for AI agents that need to answer questions or gather information from multiple sources. @@ -97,7 +97,7 @@ When you provide keywords, the module runs Google Search, parses the results, an - _Remove cookie warnings_: Dismiss cookie consent dialogs - _Debug mode_: Enable extraction diagnostics -### Advanced Settings Module +### Advanced Settings module Advanced Settings give you full control over search and extraction. Use it for complex sites or production RAG pipelines. diff --git a/sources/platform/integrations/workflows-and-notifications/make/maps.md b/sources/platform/integrations/workflows-and-notifications/make/maps.md index fbb8fa5cf3..9cd01ae50b 100644 --- a/sources/platform/integrations/workflows-and-notifications/make/maps.md +++ b/sources/platform/integrations/workflows-and-notifications/make/maps.md @@ -125,7 +125,7 @@ Categories can be general (e.g., "restaurant") which includes all variations lik } ``` -### Search with Search terms module +### Search with Search Terms module The Search Terms module is a component of the Google Maps Leads Scraper designed to discover and extract business leads by using specific search queries, similar to how you'd search on Google Maps directly. 
@@ -146,7 +146,7 @@ This module allows you to enter search terms that match what you would typically The search results can be further refined using optional category filters, which help ensure you're capturing precisely the type of businesses you're targeting. For maximum efficiency, you can combine broader search terms with strategic category filters to capture the most relevant leads without excluding valuable prospects. -### Advanced and custom search module - Google Maps Leads Scraper +### Advanced and Custom Search module - Google Maps Leads Scraper The Advanced and Custom Search module is the most powerful component of the Google Maps Leads Scraper, designed for sophisticated lead generation campaigns that require precise geographic targeting and advanced search capabilities. This module gives you complete control over your lead discovery process with multiple location definition methods and advanced filtering options. diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/index.md b/sources/platform/integrations/workflows-and-notifications/n8n/index.md index 65b5b90ffe..11a170db86 100644 --- a/sources/platform/integrations/workflows-and-notifications/n8n/index.md +++ b/sources/platform/integrations/workflows-and-notifications/n8n/index.md @@ -34,7 +34,7 @@ If you're running a self-hosted n8n instance, you can install the Apify communit ![Apify Install Node](../../images/n8n-install-node-self-hosted.png) -## Install the Apify node (n8n cloud) +## Install the Apify node (n8n Cloud) For n8n Cloud users, installation is even simpler and doesn't require manual package entry. Just search and add the node from the canvas. 
diff --git a/sources/platform/integrations/workflows-and-notifications/windmill.md b/sources/platform/integrations/workflows-and-notifications/windmill.md index 43c8f086ee..bc71fe17a1 100644 --- a/sources/platform/integrations/workflows-and-notifications/windmill.md +++ b/sources/platform/integrations/workflows-and-notifications/windmill.md @@ -60,7 +60,7 @@ You can import Apify integration scripts into your flows from the Windmill Hub, You can provide the token to scripts via a **Windmill Resource**. Create it either in the **Resources** tab or directly from a script. -#### Option A - Create in the resources tab +#### Option A - Create in the Resources tab 1. Open **Resources** → **New Resource**. 1. Select `apify_api_key` resource type. diff --git a/sources/platform/integrations/workflows-and-notifications/workato.md b/sources/platform/integrations/workflows-and-notifications/workato.md index 86051a6ad2..200ffe7acf 100644 --- a/sources/platform/integrations/workflows-and-notifications/workato.md +++ b/sources/platform/integrations/workflows-and-notifications/workato.md @@ -144,7 +144,7 @@ Each connector trigger and action field in Workato includes inline help text des The Apify connector provides the following triggers that monitor your Apify account for task completions: -### Actor run finished +### Actor Run Finished _Triggers when an Apify Actor run finishes (succeeds, fails, times out, or gets aborted)._ @@ -156,7 +156,7 @@ This trigger monitors a specific Apify Actor and starts the recipe when any run ![Screenshot of the Actor Run Finished trigger configuration in Workato](../images/workato/trigger-actor.png) -### Task run finished +### Task Run Finished _Triggers when an Apify Task run finishes (succeeds, fails, times out, or gets aborted)._ From a047786fe821fb122a3ba0b6fc98d816580d39e9 Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 13:53:45 +0000 Subject: [PATCH 7/9] docs: fix markdown lint and vale errors MIME-Version: 1.0 
Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Fix table column alignment in runs_and_builds.md and correct reCaptcha → reCAPTCHA in console/index.md. Co-Authored-By: Claude Opus 4.6 --- sources/platform/actors/running/runs_and_builds.md | 4 ++-- sources/platform/console/index.md | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/sources/platform/actors/running/runs_and_builds.md b/sources/platform/actors/running/runs_and_builds.md index b2f33dfb70..52d81a0fdf 100644 --- a/sources/platform/actors/running/runs_and_builds.md +++ b/sources/platform/actors/running/runs_and_builds.md @@ -88,8 +88,8 @@ flowchart LR | FAILED | terminal | Run failed | | TIMING-OUT | transitional | Timing out now | | TIMED-OUT | terminal | Timed out | -| ABORTING | transitional | Being aborted by the user | -| ABORTED | terminal | Aborted by the user | +| ABORTING | transitional | Being aborted by the user | +| ABORTED | terminal | Aborted by the user | ### Aborting runs diff --git a/sources/platform/console/index.md b/sources/platform/console/index.md index 441daec9fa..4675303835 100644 --- a/sources/platform/console/index.md +++ b/sources/platform/console/index.md @@ -24,7 +24,7 @@ This is the most common way of creating an account. You just need to provide you After you click the **Sign up** button, we will send you a verification email. The email contains a link that you need to click on or copy to your browser to proceed to automated email verification. After we verify your email, you will proceed to Apify Console. :::info CAPTCHA -We are using Google reCaptcha to prevent spam accounts. Usually, you will not see it, but if Google evaluates your browser as suspicious, they will ask you to solve a reCaptcha before we create your account and send you the verification email. +We are using Google reCAPTCHA to prevent spam accounts. 
Usually, you will not see it, but if Google evaluates your browser as suspicious, they will ask you to solve a reCAPTCHA before we create your account and send you the verification email. ::: If you did not receive the email, you can visit the [sign-in page](https://console.apify.com/sign-in). There, you will either proceed to our verification page right away, or you can sign in and will be redirected afterward. On the verification page, you can click on the **Resend verification email** button to send the email again. From afa5bf75626e2bcafc8de506ec0772077f9105e3 Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 13:59:56 +0000 Subject: [PATCH 8/9] docs: fix table alignment in console/index.md Align table pipes consistently for MD060 compliance. Co-Authored-By: Claude Opus 4.6 --- sources/platform/console/index.md | 54 +++++++++++++++---------------- 1 file changed, 27 insertions(+), 27 deletions(-) diff --git a/sources/platform/console/index.md b/sources/platform/console/index.md index 4675303835..b740c203fd 100644 --- a/sources/platform/console/index.md +++ b/sources/platform/console/index.md @@ -94,34 +94,34 @@ You can also navigate Apify Console via keyboard shortcuts.
Keyboard Shortcuts -|Shortcut| Tab | -|:---|:----| -|Show shortcuts | Shift? | -|Home| GH | -|Store| GO | -|Actors| GA | -|Development| GD | -|Saved tasks| GT | -|Runs| GR | -|Integrations | GI | -|Schedules| GU | -|Storage| GE | -|Proxy| GP | -|Settings| GS | -|Billing| GB | +| Shortcut | Tab | +| :--- | :--- | +| Show shortcuts | Shift? | +| Home | GH | +| Store | GO | +| Actors | GA | +| Development | GD | +| Saved tasks | GT | +| Runs | GR | +| Integrations | GI | +| Schedules | GU | +| Storage | GE | +| Proxy | GP | +| Settings | GS | +| Billing | GB |
| Tab name | Description | -|:---|:---| -| [Apify Store](/platform/console/store)| Search for Actors that suit your web-scraping needs. | -| [Actors](/platform/actors)| View recent & bookmarked Actors. | -| [Runs](/platform/actors/running/runs-and-builds)| View your recent runs. | -| [Saved tasks](/platform/actors/running/tasks)| View your saved tasks. | -| [Schedules](/platform/schedules)| Schedule Actor runs & tasks to run at specified time. | -| [Integrations](/platform/integrations)| View your integrations. | -| [Development](/platform/actors/development)| • My Actors - See Actors developed by you.
• Insights - see analytics for your Actors.
• Messaging - check on issues reported in your Actors or send emails to users of your Actors. | -| [Proxy](/platform/proxy)| View your proxy usage & credentials | -| [Storage](/platform/storage)| View stored results of your runs in various data formats. | -| [Billing](/platform/console/billing)| Billing information, statistics and invoices. | -| [Settings](/platform/console/settings)| Settings of your account. | +| :--- | :--- | +| [Apify Store](/platform/console/store) | Search for Actors that suit your web-scraping needs. | +| [Actors](/platform/actors) | View recent & bookmarked Actors. | +| [Runs](/platform/actors/running/runs-and-builds) | View your recent runs. | +| [Saved tasks](/platform/actors/running/tasks) | View your saved tasks. | +| [Schedules](/platform/schedules) | Schedule Actor runs & tasks to run at specified time. | +| [Integrations](/platform/integrations) | View your integrations. | +| [Development](/platform/actors/development) | • My Actors - See Actors developed by you.
• Insights - see analytics for your Actors.
• Messaging - check on issues reported in your Actors or send emails to users of your Actors. | +| [Proxy](/platform/proxy) | View your proxy usage & credentials | +| [Storage](/platform/storage) | View stored results of your runs in various data formats. | +| [Billing](/platform/console/billing) | Billing information, statistics and invoices. | +| [Settings](/platform/console/settings) | Settings of your account. | From 6022d97f3741e2e68b69a5b4a6525828ca8b3f14 Mon Sep 17 00:00:00 2001 From: Marcel Rebro Date: Thu, 12 Feb 2026 14:04:11 +0000 Subject: [PATCH 9/9] docs: fix table alignment in proxy/usage.md Normalize table pipe spacing for MD060 compliance. Co-Authored-By: Claude Opus 4.6 --- sources/platform/proxy/usage.md | 24 ++++++++++++------------ 1 file changed, 12 insertions(+), 12 deletions(-) diff --git a/sources/platform/proxy/usage.md b/sources/platform/proxy/usage.md index da67e34185..86d5a83898 100644 --- a/sources/platform/proxy/usage.md +++ b/sources/platform/proxy/usage.md @@ -28,12 +28,12 @@ All usage of Apify Proxy with your password is charged towards your account. Do If you want to connect to Apify Proxy from outside of the Apify platform, you need to have a paid Apify plan (to prevent abuse). If you need to test Apify Proxy before you subscribe, please [contact our support](https://apify.com/contact). -| Parameter | Value / explanation | -|---------------------|---------------------| -| Hostname | `proxy.apify.com`| -| Port | `8000` | -| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details.
**Note**: this is not your Apify username.| -| Password | Apify Proxy password. Your password is displayed on the [Proxy](https://console.apify.com/proxy/groups) page in Apify Console.
**Note**: this is not your Apify account password. | +| Parameter | Value / explanation | +| :--- | :--- | +| Hostname | `proxy.apify.com` | +| Port | `8000` | +| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details.
**Note**: this is not your Apify username. | +| Password | Apify Proxy password. Your password is displayed on the [Proxy](https://console.apify.com/proxy/groups) page in Apify Console.
**Note**: this is not your Apify account password. | :::caution External connections If you use these connection parameters for connecting to Apify Proxy from your Actors running on the Apify platform, the connection will still be considered external, it will not work on the Free plan, and on paid plans you will be charged for external data transfer. Please use the connection parameters from the [Connection from Actors](#connection-from-actors) section when using Apify Proxy from Actors. @@ -52,12 +52,12 @@ If you want to connect to Apify Proxy from Actors running on the Apify platform, If you don't want to use these helpers, and want to connect to Apify Proxy manually, you can find the right configuration values in [environment variables](../actors/development/programming_interface/environment_variables.md) provided to the Actor. By using this configuration, you ensure that you connect to Apify Proxy directly through the Apify infrastructure, bypassing any external connection via the Internet, thereby improving the connection speed, and ensuring you don't pay for external data transfer. -| Parameter | Source / explanation | -|---------------------|---------------------| -| Hostname | `APIFY_PROXY_HOSTNAME` environment variable | -| Port | `APIFY_PROXY_PORT` environment variable | -| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details.
**Note**: this is not your Apify username.| -| Password | `APIFY_PROXY_PASSWORD` environment variable | +| Parameter | Source / explanation | +| :--- | :--- | +| Hostname | `APIFY_PROXY_HOSTNAME` environment variable | +| Port | `APIFY_PROXY_PORT` environment variable | +| Username | Specifies the proxy parameters such as groups, [session](#sessions) and location. See [username parameters](#username-parameters) below for details.
**Note**: this is not your Apify username. | +| Password | `APIFY_PROXY_PASSWORD` environment variable | Example connection string creation: