Compare commits

...

311 Commits

Author SHA1 Message Date
Aarushi fe84cbe566
Revert "feature(backend): Add ability to execute store agents without agent ownership" (#9263)
Reverts Significant-Gravitas/AutoGPT#9179.

The reverted PR was preventing agents from running in dev.
2025-01-13 18:34:17 +00:00
Aarushi 5618072375
fix(blocks/Exa): Fix exa contents block advanced toggle (#9255)
Toggling the advanced option on the Exa Contents Block isn't working; it throws a frontend error.

### Changes 🏗️

Remove `Optional` from `ContentRetrievalSettings` in `exa/contents.py`.
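For context, a minimal sketch of the kind of change (illustrative Pydantic models only; the real schema in `exa/contents.py` uses the platform's `BlockSchema`/`SchemaField` helpers and may have different field names): an `Optional` nested-settings field defaulting to `None` leaves the advanced toggle with no schema or defaults to render, while a concrete default instance does not.

```python
from pydantic import BaseModel, Field


class TextSettings(BaseModel):
    # Illustrative sub-settings; real field names may differ.
    max_characters: int = Field(default=1000, description="Max characters to return")


class ContentRetrievalSettings(BaseModel):
    # Before (problematic): text: Optional[TextSettings] = None
    # After: a concrete default instance, so the advanced view always has
    # a schema and default values to show.
    text: TextSettings = Field(default_factory=TextSettings)
```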

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Add an ExaContentsBlock
  - Hit advanced


#### For configuration changes:
N/A
2025-01-13 16:35:14 +00:00
Reinier van der Leer 95b79abcfe
Revert broken Library v2 DB stuff of #9218, #9211 (#9256)
- **Revert "feature(platform): Implement library add, update, remove,
archive functionality (#9218)"**
- **Revert "feat(backend): Add Support for Managing Agent Presets with
Pagination and Soft Delete (#9211)"**

These PRs contain untested changes to DB functions and cause issues in
production.
2025-01-13 16:08:58 +01:00
Swifty fd6f28fa57
feature(platform): Implement library add, update, remove, archive functionality (#9218)
### Changes 🏗️

1. **Core Features**:
   - Add agents to the user's library.
   - Update library agents (auto-update, favorite, archive, delete); see the sketch after this list.
   - Paginate library agents and presets.
   - Execute graphs using presets.

2. **Refactoring**:
   - Replaced `UserAgent` with `LibraryAgent`.
   - Separated routes for agents and presets.

3. **Schema Changes**:
   - Added `LibraryAgent` table with fields like `isArchived`, `isDeleted`, etc.
   - Soft delete functionality for `AgentPreset`.

4. **Testing**:
   - Updated tests for `LibraryAgent` operations.
   - Added edge case tests for deletion, archiving, and pagination.

5. **Database Migrations**:
   - Migration to drop `UserAgent` and add `LibraryAgent`.
   - Added fields for soft deletion and auto-update.
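A hedged sketch of the library-agent update semantics mentioned above (a plain dataclass stand-in, not the actual Prisma model or DB layer): favorite, archive, and delete are persisted as flags on `LibraryAgent`, with delete implemented as a soft delete.

```python
from dataclasses import dataclass


@dataclass
class LibraryAgent:                              # illustrative; the real model is a Prisma table
    id: str
    use_graph_is_active_version: bool = True     # "auto-update" to the active graph version
    is_favorite: bool = False
    is_archived: bool = False
    is_deleted: bool = False                     # soft delete: the row is kept, just hidden


def update_library_agent(agent: LibraryAgent, *, favorite: bool | None = None,
                         archive: bool | None = None, delete: bool = False) -> LibraryAgent:
    """Apply the update operations described in the PR as flag changes."""
    if favorite is not None:
        agent.is_favorite = favorite
    if archive is not None:
        agent.is_archived = archive
    if delete:
        agent.is_deleted = True                  # never a hard DELETE; queries filter on is_deleted
    return agent
```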


Note: this includes the changes from the following PRs to avoid merge conflicts with them:

#9179 
#9211

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-01-10 13:02:53 +01:00
Swifty 4b17cc9963
feat(backend): Add Support for Managing Agent Presets with Pagination and Soft Delete (#9211)
#### Summary
- **New Models**: Added `LibraryAgentPreset`,
`LibraryAgentPresetResponse`, `Pagination`, and
`CreateLibraryAgentPresetRequest`.
- **Database Changes**:
  - Added `isDeleted` column in `AgentPreset` for soft delete.
  - CRUD operations for `AgentPreset`:
    - `get_presets` with pagination.
    - `get_preset` by ID.
    - `create_or_update_preset` for upsert.
    - `delete_preset` to soft delete.
- **API Routes** (sketched after this list):
  - `GET /presets`: Fetch paginated presets.
  - `GET /presets/{preset_id}`: Fetch a single preset.
  - `POST /presets`: Create a preset.
  - `PUT /presets/{preset_id}`: Update a preset.
  - `DELETE /presets/{preset_id}`: Soft delete a preset.
- **Tests**:
  - Coverage for models, CRUD operations, and pagination.
- **Migration**:
  - Added `isDeleted` field to support soft delete.
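A minimal sketch of the paginated `GET /presets` route, assuming FastAPI-style routing; the model fields and the `get_presets` signature here are illustrative, not the actual backend code.

```python
from fastapi import APIRouter, Query
from pydantic import BaseModel

router = APIRouter()


class Pagination(BaseModel):
    total_items: int
    total_pages: int
    current_page: int
    page_size: int


class LibraryAgentPreset(BaseModel):
    id: str
    name: str
    is_active: bool = True


class LibraryAgentPresetResponse(BaseModel):
    presets: list[LibraryAgentPreset]
    pagination: Pagination


async def get_presets(user_id: str, page: int, page_size: int) -> LibraryAgentPresetResponse:
    """Stand-in for the DB layer's paginated query (the PR's `get_presets`).

    A real implementation would filter by user and exclude soft-deleted (isDeleted) rows.
    """
    items: list[LibraryAgentPreset] = []
    return LibraryAgentPresetResponse(
        presets=items,
        pagination=Pagination(total_items=0, total_pages=0, current_page=page, page_size=page_size),
    )


@router.get("/presets", response_model=LibraryAgentPresetResponse)
async def list_presets(
    user_id: str,                       # in the real API this comes from auth, not a query param
    page: int = Query(1, ge=1),
    page_size: int = Query(10, ge=1),
) -> LibraryAgentPresetResponse:
    return await get_presets(user_id, page=page, page_size=page_size)
```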

#### Review Notes
- Validate migration scripts and test coverage.
- Ensure API aligns with project standards.

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-01-10 12:57:35 +01:00
Swifty 00bb7c67b3
feature(backend): Add ability to execute store agents without agent ownership (#9179)
### Description

This PR enables the execution of store agents even if they are not owned
by the user. Key changes include handling store-listed agents in the
`get_graph` logic, improving execution flow, and ensuring
version-specific handling. These updates support more flexible agent
execution.

### Changes 🏗️

- **Graph Retrieval:** Updated `get_graph` to check store listings for agents not owned by the user (see the sketch after this list).
- **Version Handling:** Added `graph_version` to execution methods for
consistent version-specific execution.
- **Execution Flow:** Refactored `scheduler.py`, `rest_api.py`, and
other modules for clearer logic and better maintainability.
- **Testing:** Updated `test_manager.py` and other test cases to validate execution of store-listed agents; added a test for graph access.
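A hedged sketch of the `get_graph` fallback described under **Graph Retrieval** (in-memory stand-ins for the DB layer; not the actual backend functions):

```python
from typing import Any, Optional

# Illustrative stand-ins for the real database layer.
_GRAPHS: dict[tuple[str, int], dict[str, Any]] = {}    # (graph_id, version) -> graph, with "user_id" owner
_STORE_LISTINGS: set[tuple[str, int]] = set()           # approved store-listed (graph_id, version) pairs


async def get_graph(graph_id: str, version: int, user_id: str) -> Optional[dict[str, Any]]:
    """Return a graph the user owns, falling back to store-listed graphs they don't own."""
    graph = _GRAPHS.get((graph_id, version))
    if graph and graph.get("user_id") == user_id:
        return graph                                     # normal path: the user owns the graph
    if graph and (graph_id, version) in _STORE_LISTINGS:
        return graph                                     # fallback: this exact version is store-listed
    return None                                          # not owned and not listed -> no access
```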

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-10 12:39:06 +01:00
Zamil Majdy 9d1bc25ffa
hotfix(backend): Increase statement timeout for the double brace migration (#9245)
### Changes 🏗️


https://github.com/Significant-Gravitas/AutoGPT/actions/runs/12696734339/job/35391431786

2025-01-09 21:59:22 +00:00
Zamil Majdy 3a3ee994c2
hotfix(backend): Increase statement timeout for the double brace migration (#9244)
### Changes 🏗️


https://github.com/Significant-Gravitas/AutoGPT/actions/runs/12696734339/job/35391431786

2025-01-09 19:44:37 +00:00
Aarushi 0d44f5be13
feat(backend/blocks/nvidia): Provide Nvidia by default (#9235)
We want to allow users to use Nvidia without their own API keys.

### Changes 🏗️

Added the Nvidia API key to the credentials store.

### Checklist 📋


#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes

2025-01-09 18:12:05 +00:00
Zamil Majdy 1670579a61
fix(block): Remove Python.format & Jinja templating format backward compatibility (#9229)
Python format uses `{Variable}` as the variable placeholder, while Jinja
uses `{{Variable}}` as its default.
Jinja is the main templating engine in the system, but the Python format version is still maintained for backward compatibility.

However, that backward-compatibility support can cause a side effect when a JSON string value is passed into a block that uses it:
https://github.com/Significant-Gravitas/AutoGPT/issues/9194
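For illustration, a small self-contained example of the side effect (requires `jinja2`; this is not platform code): `str.format` treats any single-brace pair as a placeholder, so a JSON value inside the template text fails, while Jinja only reacts to `{{ ... }}`.

```python
from jinja2 import Template

payload = '{"status": "ok"}'          # a JSON string flowing into a template block

# Python .format() treats every single-brace pair as a replacement field, so a JSON
# value inside the template text blows up (KeyError on '"status"').
try:
    ("Result: " + payload).format()
except (KeyError, ValueError) as error:
    print("str.format failed:", error)

# Jinja only treats {{ ... }} as a placeholder, so single braces pass through untouched.
print(Template("Result: " + payload + " for {{ name }}").render(name="agent"))
```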

### Changes 🏗️

* Use the `{{Variable}}` placeholder format and remove `{Variable}` support in these blocks:
  - `363ae599-353e-4804-937e-b2ee3cef3da4` (AgentOutputBlock)
  - `db7d8f02-2f44-4c55-ab7a-eae0941f0c30` (FillTextTemplateBlock)
  - `1f292d4a-41a4-4977-9684-7c8d560b9f91` (AITextGeneratorBlock)
  - `ed55ac19-356e-4243-a6cb-bc599e9b716f` (AIStructuredResponseGeneratorBlock)
* Add Jinja templating support to `AITextGeneratorBlock` & `AIStructuredResponseGeneratorBlock`
* Migrate the existing database content to prevent breaking changes.

2025-01-09 16:29:16 +00:00
Bently a1889e6212
docs(Ollama): Update Ollama docs (#9234)
The Ollama docs were very out of date and needed updating, so I have updated them and added some screenshots to make them easier to follow.

I have also added a new Ollama model to the platform, "llama3.2", as that is what I based the tutorial on and its name is easy to find in the list of models.

I also added a new folder in the "imgs" dir to store the Ollama-related photos, just to keep things tidy.
2025-01-09 15:19:37 +00:00
Abhimanyu Yadav 9c702516fd
feat(platform): fix carousel on store page (#9230)
- resolves #8973 

Adds smooth scrolling and fixes some odd interactions on the carousel.

### Changes

- Update `CarouselPrevious` and `CarouselNext`, and add `CarouselIndicator`, in `carousel.tsx`
- Add `CarouselPrevious`, `CarouselNext`, and `CarouselIndicator` support in `FeaturedSection.tsx`

### Demo 


https://github.com/user-attachments/assets/ba9a22fa-ddf2-469f-ba8a-aee1a7fc5f78
2025-01-09 13:48:53 +00:00
Aarushi 32c908ae13
fix(backend): Add default credentials for Fal, Exa, E2B (#9233)
We want to provide certain providers by default on our platform. These three were not added previously, so this fixes that.

### Changes 🏗️

If API keys for Fal, Exa, or E2B exist in environment variables, load them by default as credentials usable by our users.
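A hedged sketch of the pattern (the credential model, environment-variable names, and store wiring are illustrative, not the actual platform code): if a provider's key is present in the environment, expose it as a default credential.

```python
import os

from pydantic import BaseModel, SecretStr


class APIKeyCredentials(BaseModel):
    """Illustrative stand-in for the platform's API-key credential model."""
    provider: str
    api_key: SecretStr


def default_credentials_from_env() -> list[APIKeyCredentials]:
    """Build default credentials for providers whose keys are set in the environment."""
    creds: list[APIKeyCredentials] = []
    for provider, env_var in [("fal", "FAL_API_KEY"), ("exa", "EXA_API_KEY"), ("e2b", "E2B_API_KEY")]:
        if key := os.getenv(env_var):                     # only expose providers that are configured
            creds.append(APIKeyCredentials(provider=provider, api_key=SecretStr(key)))
    return creds
```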

### Checklist 📋


#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes

2025-01-09 13:47:54 +00:00
Abhimanyu Yadav b4a0100c22
feat(platform): Add Twitter integration (#8754)
- Resolves #8326  

Create a Twitter integration with some small frontend changes.

### Changes
1. Add Twitter OAuth 2.0 with PKCE support for authentication.
2. Add a frontend multi-select component for choosing multiple values from a list of enums.
3. Add blocks for Twitter integration.
4. `_types.py` for reused enums and input types.
5. `_builders.py` for creating parameters without repeating the same logic.
6. `_serializer.py` to serialize the Tweepy enums into dictionaries so they can travel easily over Pyro5 (see the sketch after the note below).
7. `_mappers.py` to map the frontend values to the correct request values.

> I have added a new multi-select feature because my list contains many items, and selecting all of them makes the block cluttered. This new block displays only the first two items and then shows something like "2 more". It works only for lists of enums.
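A minimal sketch of the `_serializer.py` idea (generic stdlib enums here, not the real Tweepy types): flatten enum members into plain values so the data can travel over Pyro5 as ordinary dicts and strings.

```python
from enum import Enum
from typing import Any


class TweetExpansion(Enum):          # illustrative enum, not an actual Tweepy type
    AUTHOR_ID = "author_id"
    ATTACHMENTS_MEDIA_KEYS = "attachments.media_keys"


def serialize(value: Any) -> Any:
    """Recursively turn Enum members into their plain values so Pyro5 can ship them."""
    if isinstance(value, Enum):
        return value.value
    if isinstance(value, (list, tuple, set)):
        return [serialize(item) for item in value]
    if isinstance(value, dict):
        return {key: serialize(item) for key, item in value.items()}
    return value


print(serialize({"expansions": [TweetExpansion.AUTHOR_ID]}))
# -> {'expansions': ['author_id']}
```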


### Blocks

Block Name | What It Does | Error Reason | Manual Testing
-- | -- | -- | --
`TwitterBookmarkTweetBlock` | Bookmark a tweet on Twitter | No error |
`TwitterGetBookmarkedTweetsBlock` | Get all your bookmarked tweets from Twitter | No error |
`TwitterRemoveBookmarkTweetBlock` | Remove a bookmark for a tweet on Twitter | No error |
`TwitterHideReplyBlock` | Hides a reply of one of your tweets | No error |
`TwitterUnhideReplyBlock` | Unhides a reply to a tweet | No error |
`TwitterLikeTweetBlock` | Likes a tweet | No error |
`TwitterGetLikingUsersBlock` | Gets information about users who liked one of your tweets | No error |
`TwitterGetLikedTweetsBlock` | Gets information about tweets liked by you | No error |
`TwitterUnlikeTweetBlock` | Unlikes a tweet that was previously liked | No error |
`TwitterPostTweetBlock` | Create a tweet on Twitter with the option to include one additional element such as media, quote, or deep link. | No error |
`TwitterDeleteTweetBlock` | Deletes a tweet on Twitter using Twitter ID | No error |
`TwitterSearchRecentTweetsBlock` | Searches all public Tweets in Twitter history | No error |
`TwitterGetQuoteTweetsBlock` | Gets quote tweets for a specified tweet ID | No error |
`TwitterRetweetBlock` | Retweets a tweet on Twitter | No error |
`TwitterRemoveRetweetBlock` | Removes a retweet on Twitter | No error |
`TwitterGetRetweetersBlock` | Gets information about who has retweeted a tweet | No error |
`TwitterGetUserMentionsBlock` | Returns Tweets where a single user is mentioned, just put that user ID | No error |
`TwitterGetHomeTimelineBlock` | Returns a collection of the most recent Tweets and Retweets posted by you and users you follow | No error |
`TwitterGetUserTweetsBlock` | Returns Tweets composed by a single user, specified by the requested user ID | No error |
`TwitterGetTweetBlock` | Returns information about a single Tweet specified by the requested ID | No error |
`TwitterGetTweetsBlock` | Returns information about multiple Tweets specified by the requested IDs | No error |
`TwitterUnblockUserBlock` | Unblock a specific user on Twitter | No error |
`TwitterGetBlockedUsersBlock` | Get a list of users who are blocked by the authenticating user | No error |
`TwitterBlockUserBlock` | Block a specific user on Twitter | No error |
`TwitterUnfollowUserBlock` | Allows a user to unfollow another user specified by target user ID | No error |
`TwitterFollowUserBlock` | Allows a user to follow another user specified by target user ID | No error |
`TwitterGetFollowersBlock` | Retrieves a list of followers for a specified Twitter user ID | Need Enterprise level access |
`TwitterGetFollowingBlock` | Retrieves a list of users that a specified Twitter user ID is following | Need Enterprise level access |
`TwitterUnmuteUserBlock` | Allows a user to unmute another user specified by target user ID | No error |
`TwitterGetMutedUsersBlock` | Returns a list of users who are muted by the authenticating user | No error |
`TwitterMuteUserBlock` | Allows a user to mute another user specified by target user ID | No error |
`TwitterGetUserBlock` | Gets information about a single Twitter user specified by ID or username | No error |
`TwitterGetUsersBlock` | Gets information about multiple Twitter users specified by IDs or usernames | No error |
`TwitterSearchSpacesBlock` | Returns live or scheduled Spaces matching specified search terms [for a week only] | No error |
`TwitterGetSpacesBlock` | Gets information about multiple Twitter Spaces specified by Space IDs or creator user IDs | No error |
`TwitterGetSpaceByIdBlock` | Gets information about a single Twitter Space specified by Space ID | No error |
`TwitterGetSpaceBuyersBlock` | Gets list of users who purchased a ticket to the requested Space | I do not have a monetized account for this |
`TwitterGetSpaceTweetsBlock` | Gets list of Tweets shared in the requested Space | No error |
`TwitterUnfollowListBlock` | Unfollows a Twitter list for the authenticated user | No error |
`TwitterFollowListBlock` | Follows a Twitter list for the authenticated user | No error |
`TwitterListGetFollowersBlock` | Gets followers of a specified Twitter list | Enterprise level access |
`TwitterGetFollowedListsBlock` | Gets lists followed by a specified Twitter user | Enterprise level access |
`TwitterGetListBlock` | Gets information about a Twitter List specified by ID | No error |
`TwitterGetOwnedListsBlock` | Gets all Lists owned by the specified user | No error |
`TwitterRemoveListMemberBlock` | Removes a member from a Twitter List that the authenticated user owns | No error |
`TwitterAddListMemberBlock` | Adds a member to a Twitter List that the authenticated user owns | No error |
`TwitterGetListMembersBlock` | Gets the members of a specified Twitter List | No error |
`TwitterGetListMembershipsBlock` | Gets all Lists that a specified user is a member of | No error |
`TwitterGetListTweetsBlock` | Gets tweets from a specified Twitter list | No error |
`TwitterDeleteListBlock` | Deletes a Twitter List owned by the authenticated user | No error |
`TwitterUpdateListBlock` | Updates a Twitter List owned by the authenticated user | No error |
`TwitterCreateListBlock` | Creates a Twitter List owned by the authenticated user | No error |
`TwitterUnpinListBlock` | Enables the authenticated user to unpin a List. | No error |
`TwitterPinListBlock` | Enables the authenticated user to pin a List. | No error |
`TwitterGetPinnedListsBlock` | Returns the Lists pinned by the authenticated user. | No error |
`TwitterGetDMEventsBlock` | Gets a list of Direct Message events for the authenticated user | Need Enterprise level access |
`TwitterSendDirectMessageBlock` | Sends a direct message to a Twitter user | Need Enterprise level access |
`TwitterCreateDMConversationBlock` | Creates a new group direct message | Need Enterprise level access |

### Need to add more stuff
1. A normal input to select date and time.
2. Some more enterprise-level blocks, especially webhook triggers.

Supported triggers 


Event Name | Description
-- | --
Posts (by user) | User creates a new post.
Post deletes (by user) | User deletes an existing post.
@mentions (of user) | User is mentioned in a post.
Replies (to or from user) | User replies to a post or receives a reply from another user.
Retweets (by user or of user) | User retweets a post or someone retweets the user's post.
Quote Tweets (by user or of user) | User quote tweets a post or someone quote tweets the user's post.
Retweets of Quoted Tweets (by user or of user) | Retweets of quote tweets by the user or of the user.
Likes (by user or of user) | User likes a post or someone likes the user's post.
Follows (by user or of user) | User follows another user or another user follows the user.
Unfollows (by user) | User unfollows another user.
Blocks (by user) | User blocks another user.
Unblocks (by user) | User unblocks a previously blocked user.
Mutes (by user) | User mutes another user.
Unmutes (by user) | User unmutes a previously muted user.
Direct Messages sent (by user) | User sends direct messages to other users.
Direct Messages received (by user) | User receives direct messages from other users.
Typing indicators (to user) | Indicators showing when someone is typing a message to the user.
Read receipts (to user) | Indicators showing when the user has read a message.
Subscription revokes (by user) | User revokes a subscription to a service or content.

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Nicholas Tindle <nicktindle@outlook.com>
2025-01-08 19:47:00 +00:00
Bently e4d8502729
fix(blocks): improve handling of plain text in send web request block (#9219)
### Changes 🏗️
This improves how the send web request block deals with plain text; the decision logic is sketched after the list below.

-	Plain text stays as plain text (regardless of JSON toggle)
-	Valid JSON with JSON toggle enabled sends as JSON
-	JSON-like data with JSON toggle disabled sends as form data
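A rough sketch of that decision logic (parameter and key names are illustrative, mapped onto `requests`-style keyword arguments; not the block's actual code):

```python
import json
from typing import Any


def prepare_request_body(body: str, send_as_json: bool) -> dict[str, Any]:
    """Map the block's body text and JSON toggle onto requests-style keyword arguments."""
    try:
        parsed = json.loads(body)
    except json.JSONDecodeError:
        # Plain text stays plain text, regardless of the JSON toggle.
        return {"data": body, "headers": {"Content-Type": "text/plain"}}
    if send_as_json:
        return {"json": parsed}       # valid JSON + toggle enabled -> send as JSON
    return {"data": parsed}           # JSON-like data + toggle disabled -> send as form data


print(prepare_request_body("hello world", send_as_json=True))
print(prepare_request_body('{"a": 1}', send_as_json=False))
```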
2025-01-08 12:18:42 +00:00
Abhimanyu Yadav 43a79d063f
feat(platform) : Add api key generation frontend (#9212)
Allow users to create API keys for the AutoGPT platform. The backend is
already set up, and here I’ve added the frontend for it.

### Changes
1. Fix the `response-model` of the API keys endpoints.  
2. Add a new page `/store/api_keys`.  
3. Add an `APIKeySection` component to create, delete, and view all your
API keys.

<img width="1512" alt="Screenshot 2025-01-07 at 3 59 25 PM"
src="https://github.com/user-attachments/assets/ea4e9d35-eb92-4e10-a4fb-1fc51dfe11bb"
/>
2025-01-08 11:13:08 +00:00
Abhimanyu Yadav 7ec9830b02
fix(platform): Add custom fonts and update layout styles (#9195)
- resolves #9187
### Changes 🏗️

Add support for `Inter`, `Poppins`, `Geist-Mono`, `Geist-Neue`, and
`Neue` in `layout.tsx` and `tailwind.config.ts`.

<img width="844" alt="Screenshot 2025-01-06 at 10 59 35 AM"
src="https://github.com/user-attachments/assets/5e93e8a3-cda1-4d01-ba5d-7027a8c1dea7"
/>

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-01-08 09:44:35 +00:00
Aarushi b558ccae0b
feat(blocks/nvidia): Add Nvidia deepfake detection block (#9213)
Adds a block that allows users to detect deepfakes in their workflows. The block takes an image as input and returns the probability that it is a deepfake, along with the associated bounding boxes.
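A hedged sketch of the block's input/output shape (field names are illustrative; the real block uses the platform's `BlockSchema` helpers and NVIDIA's response format):

```python
from pydantic import BaseModel, Field


class DeepfakeDetectInput(BaseModel):
    image_base64: str = Field(description="Image to analyse, base64-encoded")


class BoundingBox(BaseModel):
    x: float
    y: float
    width: float
    height: float


class DeepfakeDetectOutput(BaseModel):
    probability: float = Field(ge=0.0, le=1.0, description="Likelihood that the image is a deepfake")
    bounding_boxes: list[BoundingBox] = Field(default_factory=list)
```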

### Changes 🏗️

- Added NvidiaDeepfakeDetectBlock
- Added the ability to upload images on the frontend
- Added the ability to render base64 encoded images on the frontend
<img width="1001" alt="Screenshot 2025-01-07 at 2 16 42 PM"
src="https://github.com/user-attachments/assets/c3d090f3-3981-4235-a66b-f8e2a3920a4d"
/>


---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-07 22:41:28 +00:00
Nicholas Tindle 4115f65223
Fix Provider name enum being used instead of value (#9216)

Webhooks are broken

### Changes 🏗️
Swaps how provider names are filled into webhook strings, using the enum's value instead of the enum member itself (illustrated below).
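For illustration (hypothetical enum and URL template, not the platform's actual code): depending on the Python version and the enum's base classes, interpolating the member directly can render its name rather than its value, so using `.value` explicitly is the unambiguous fix.

```python
from enum import Enum


class ProviderName(str, Enum):       # hypothetical stand-in for the platform's provider enum
    GITHUB = "github"


provider = ProviderName.GITHUB

# Depending on Python version / enum bases, this can render as
# ".../ProviderName.GITHUB/..." instead of ".../github/...":
maybe_broken = f"/integrations/{provider}/webhooks/ingress"

# Explicitly using the value always yields the provider string:
fixed = f"/integrations/{provider.value}/webhooks/ingress"

print(maybe_broken)
print(fixed)
```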

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Manually test creating a webhook with Github and Compass
2025-01-07 20:18:43 +00:00
dependabot[bot] 7defba8d24
chore(frontend/deps-dev): bump the development-dependencies group in /autogpt_platform/frontend with 4 updates (#9207)
Bumps the development-dependencies group in /autogpt_platform/frontend
with 4 updates:
[@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node),
[chromatic](https://github.com/chromaui/chromatic-cli),
[concurrently](https://github.com/open-cli-tools/concurrently) and
[eslint-plugin-storybook](https://github.com/storybookjs/eslint-plugin-storybook).

Updates `@types/node` from 22.10.2 to 22.10.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node">compare
view</a></li>
</ul>
</details>
<br />

Updates `chromatic` from 11.20.2 to 11.22.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/chromatic-cli/releases">chromatic's
releases</a>.</em></p>
<blockquote>
<h2>v11.22.0</h2>
<h4>🚀 Enhancement</h4>
<ul>
<li>Bail on preview file changes <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1133">#1133</a>
(<a href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Cody Kaup (<a
href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<h2>v11.21.0</h2>
<h4>🚀 Enhancement</h4>
<ul>
<li>Set <code>storybookUrl</code> action output on rebuild early exit <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1134">#1134</a>
(<a href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
<li>Upload coverage reports to codecov <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1132">#1132</a>
(<a
href="https://github.com/paulelliott"><code>@​paulelliott</code></a>)</li>
</ul>
<h4>Authors: 2</h4>
<ul>
<li>John Hobbs (<a
href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
<li>Paul Elliott (<a
href="https://github.com/paulelliott"><code>@​paulelliott</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/chromatic-cli/blob/main/CHANGELOG.md">chromatic's
changelog</a>.</em></p>
<blockquote>
<h1>v11.22.0 (Fri Jan 03 2025)</h1>
<h4>🚀 Enhancement</h4>
<ul>
<li>Bail on preview file changes <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1133">#1133</a>
(<a href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Cody Kaup (<a
href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<hr />
<h1>v11.21.0 (Fri Jan 03 2025)</h1>
<h4>🚀 Enhancement</h4>
<ul>
<li>Set <code>storybookUrl</code> action output on rebuild early exit <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1134">#1134</a>
(<a href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
<li>Upload coverage reports to codecov <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1132">#1132</a>
(<a
href="https://github.com/paulelliott"><code>@​paulelliott</code></a>)</li>
</ul>
<h4>Authors: 2</h4>
<ul>
<li>John Hobbs (<a
href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
<li>Paul Elliott (<a
href="https://github.com/paulelliott"><code>@​paulelliott</code></a>)</li>
</ul>
<hr />
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="8f27f8b046"><code>8f27f8b</code></a>
Bump version to: 11.22.0 [skip ci]</li>
<li><a
href="0c209609f3"><code>0c20960</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="43a9b94828"><code>43a9b94</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/chromatic-cli/issues/1133">#1133</a>
from chromaui/cody/cap-2599-turbosnap-exit-on-storyb...</li>
<li><a
href="730a7aa0d3"><code>730a7aa</code></a>
Bump version to: 11.21.0 [skip ci]</li>
<li><a
href="0afebf6ad9"><code>0afebf6</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="f729a2a4ee"><code>f729a2a</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/chromatic-cli/issues/1134">#1134</a>
from chromaui/jmhobbs/cap-2317-chromauiaction-skippi...</li>
<li><a
href="05bcf30274"><code>05bcf30</code></a>
Set storybookUrl action output on rebuild abort</li>
<li><a
href="300222ff7f"><code>300222f</code></a>
Bail on preview file changes</li>
<li><a
href="9dbaef7d89"><code>9dbaef7</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/chromatic-cli/issues/1132">#1132</a>
from chromaui/paulelliott/set-up-codecov</li>
<li><a
href="851574b606"><code>851574b</code></a>
Run lint-and-test workflow on pushes to main</li>
<li>Additional commits viewable in <a
href="https://github.com/chromaui/chromatic-cli/compare/v11.20.2...v11.22.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `concurrently` from 9.1.1 to 9.1.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/open-cli-tools/concurrently/releases">concurrently's
releases</a>.</em></p>
<blockquote>
<h2>v9.1.2</h2>
<h2>What's Changed</h2>
<ul>
<li>Add ability to have custom logger by <a
href="https://github.com/mwood23"><code>@​mwood23</code></a> in <a
href="https://redirect.github.com/open-cli-tools/concurrently/pull/522">open-cli-tools/concurrently#522</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/mwood23"><code>@​mwood23</code></a> made
their first contribution in <a
href="https://redirect.github.com/open-cli-tools/concurrently/pull/522">open-cli-tools/concurrently#522</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/open-cli-tools/concurrently/compare/v9.1.1...v9.1.2">https://github.com/open-cli-tools/concurrently/compare/v9.1.1...v9.1.2</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7f3efb201b"><code>7f3efb2</code></a>
9.1.2</li>
<li><a
href="36eccae46c"><code>36eccae</code></a>
Add ability to have custom logger (<a
href="https://redirect.github.com/open-cli-tools/concurrently/issues/522">#522</a>)</li>
<li>See full diff in <a
href="https://github.com/open-cli-tools/concurrently/compare/v9.1.1...v9.1.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `eslint-plugin-storybook` from 0.11.1 to 0.11.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/eslint-plugin-storybook/releases">eslint-plugin-storybook's
releases</a>.</em></p>
<blockquote>
<h2>v0.11.2</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li>fix(peer-deps): update eslint version range to <code>&gt;=8</code>
<a
href="https://redirect.github.com/storybookjs/eslint-plugin-storybook/pull/186">#186</a>
(<a href="https://github.com/zacowan"><code>@​zacowan</code></a> <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 2</h4>
<ul>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Zachary Cowan (<a
href="https://github.com/zacowan"><code>@​zacowan</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/eslint-plugin-storybook/blob/main/CHANGELOG.md">eslint-plugin-storybook's
changelog</a>.</em></p>
<blockquote>
<h1>v0.11.2 (Thu Jan 02 2025)</h1>
<h4>🐛 Bug Fix</h4>
<ul>
<li>fix(peer-deps): update eslint version range to <code>&gt;=8</code>
<a
href="https://redirect.github.com/storybookjs/eslint-plugin-storybook/pull/186">#186</a>
(<a href="https://github.com/zacowan"><code>@​zacowan</code></a> <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 2</h4>
<ul>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Zachary Cowan (<a
href="https://github.com/zacowan"><code>@​zacowan</code></a>)</li>
</ul>
<hr />
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d8acf1fbc4"><code>d8acf1f</code></a>
Bump version to: 0.11.2 [skip ci]</li>
<li><a
href="6ff93f31ac"><code>6ff93f3</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="633e59828f"><code>633e598</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/eslint-plugin-storybook/issues/186">#186</a>
from zacowan/zacowan-patch-1</li>
<li><a
href="9351188fc3"><code>9351188</code></a>
docs: add compatibility table</li>
<li><a
href="6906363c56"><code>6906363</code></a>
fix(deps): use accurate eslint peer</li>
<li>See full diff in <a
href="https://github.com/storybookjs/eslint-plugin-storybook/compare/v0.11.1...v0.11.2">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-07 18:42:51 +00:00
dependabot[bot] f5afdcc650
chore(frontend/deps): bump the production-dependencies group across 1 directory with 11 updates (#9214)
Bumps the production-dependencies group with 11 updates in the
/autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
| [@hookform/resolvers](https://github.com/react-hook-form/resolvers) |
`3.9.1` | `3.10.0` |
|
[@next/third-parties](https://github.com/vercel/next.js/tree/HEAD/packages/third-parties)
| `15.1.0` | `15.1.3` |
| [@sentry/nextjs](https://github.com/getsentry/sentry-javascript) |
`8.45.1` | `8.48.0` |
| [framer-motion](https://github.com/motiondivision/motion) | `11.15.0`
| `11.16.0` |
|
[lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react)
| `0.468.0` | `0.469.0` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.4.4`
| `9.5.0` |
| [react-hook-form](https://github.com/react-hook-form/react-hook-form)
| `7.54.1` | `7.54.2` |
| [react-markdown](https://github.com/remarkjs/react-markdown) | `9.0.1`
| `9.0.3` |
| [react-modal](https://github.com/reactjs/react-modal) | `3.16.1` |
`3.16.3` |
| [tailwind-merge](https://github.com/dcastil/tailwind-merge) | `2.5.5`
| `2.6.0` |
| [uuid](https://github.com/uuidjs/uuid) | `11.0.3` | `11.0.4` |


Updates `@hookform/resolvers` from 3.9.1 to 3.10.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/react-hook-form/resolvers/releases"><code>@​hookform/resolvers</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v3.10.0</h2>
<h1><a
href="https://github.com/react-hook-form/resolvers/compare/v3.9.1...v3.10.0">3.10.0</a>
(2025-01-06)</h1>
<h3>Features</h3>
<ul>
<li>update to effect 3.10 (<a
href="https://redirect.github.com/react-hook-form/resolvers/issues/729">#729</a>)
(<a
href="10aca41229">10aca41</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="10aca41229"><code>10aca41</code></a>
feat: update to effect 3.10 (<a
href="https://redirect.github.com/react-hook-form/resolvers/issues/729">#729</a>)</li>
<li><a
href="e523dde4d9"><code>e523dde</code></a>
chore: update to effet 3.10 (<a
href="https://redirect.github.com/react-hook-form/resolvers/issues/720">#720</a>)</li>
<li>See full diff in <a
href="https://github.com/react-hook-form/resolvers/compare/v3.9.1...v3.10.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `@next/third-parties` from 15.1.0 to 15.1.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases"><code>@​next/third-parties</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v15.1.3</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>Retry manifest file loading only in dev mode: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73900">#73900</a></li>
<li>Use shared worker for lint &amp; typecheck steps: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74154">#74154</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/unstubbable"><code>@​unstubbable</code></a> and
<a href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.2</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>Update React from 7283a213-20241206 to 65e06cb7-20241218: <a
href="https://redirect.github.com/vercel/next.js/pull/74117">vercel/next.js#74117</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.1</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>fix(turbo): sassOptions silenceDeprecations was not overwritten with
user options: <a
href="https://redirect.github.com/vercel/next.js/pull/73937">vercel/next.js#73937</a></li>
<li>refactor collectAppPageSegments: <a
href="https://redirect.github.com/vercel/next.js/pull/73908">vercel/next.js#73908</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/devjiwonchoi"><code>@​devjiwonchoi</code></a>
and <a href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.1-canary.26</h2>
<h3>Core Changes</h3>
<ul>
<li>Upgrade React from <code>518d06d2-20241219</code> to
<code>3b009b4c-20250102</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74492">#74492</a></li>
<li>fix: add node internals stack frames to ignored list: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73698">#73698</a></li>
<li>chore: break calls to forEach into for loops: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74523">#74523</a></li>
<li>[DevOverlay] Add error message: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74541">#74541</a></li>
<li>[DevOverlay] Add error type label: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74543">#74543</a></li>
<li>feat: connect error rating buttons to telemetry API: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74496">#74496</a></li>
<li>[metadata] Move metadata rendering adjacent to page component: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74262">#74262</a></li>
<li>Delete set-cache-busting-search-param.test.ts: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74561">#74561</a></li>
<li>fix: enhance a11y, prevent double firing in error rating: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74563">#74563</a></li>
<li>fix: add aria-hidden to error overlay voting icons: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74568">#74568</a></li>
</ul>
<h3>Misc Changes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4cbaaa118d"><code>4cbaaa1</code></a>
v15.1.3</li>
<li><a
href="df392a1b97"><code>df392a1</code></a>
v15.1.2</li>
<li><a
href="4384c6834a"><code>4384c68</code></a>
v15.1.1</li>
<li>See full diff in <a
href="https://github.com/vercel/next.js/commits/v15.1.3/packages/third-parties">compare
view</a></li>
</ul>
</details>
<br />

Updates `@sentry/nextjs` from 8.45.1 to 8.48.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/releases"><code>@​sentry/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>8.48.0</h2>
<h3>Deprecations</h3>
<ul>
<li>
<p><strong>feat(v8/core): Deprecate <code>getDomElement</code> method
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14799">#14799</a>)</strong></p>
<p>Deprecates <code>getDomElement</code>. There is no replacement.</p>
</li>
</ul>
<h3>Other changes</h3>
<ul>
<li>fix(nestjs/v8): Use correct main/module path in package.json (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14791">#14791</a>)</li>
<li>fix(v8/core): Use consistent <code>continueTrace</code>
implementation in core (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14819">#14819</a>)</li>
<li>fix(v8/node): Correctly resolve debug IDs for ANR events with custom
appRoot (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14823">#14823</a>)</li>
<li>fix(v8/node): Ensure <code>NODE_OPTIONS</code> is not passed to
worker threads (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14825">#14825</a>)</li>
<li>fix(v8/angular): Fall back to element <code>tagName</code> when name
is not provided to <code>TraceDirective</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14828">#14828</a>)</li>
<li>fix(aws-lambda): Remove version suffix from lambda layer (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14843">#14843</a>)</li>
<li>fix(v8/node): Ensure express requests are properly handled (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14851">#14851</a>)</li>
<li>feat(v8/node): Add <code>openTelemetrySpanProcessors</code> option
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14853">#14853</a>)</li>
<li>fix(v8/react): Use <code>Set</code> as the <code>allRoutes</code>
container. (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14878">#14878</a>)
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14884">#14884</a>)</li>
<li>fix(v8/react): Improve handling of routes nested under
path=&quot;/&quot; (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14897">#14897</a>)</li>
<li>feat(v8/core): Add <code>normalizedRequest</code> to
<code>samplingContext</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14903">#14903</a>)</li>
<li>fix(v8/feedback): Avoid lazy loading code for
<code>syncFeedbackIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14918">#14918</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/arturovt"><code>@​arturovt</code></a>. Thank
you for your contribution!</p>
<h2>Bundle size 📦</h2>
<table>
<thead>
<tr>
<th>Path</th>
<th>Size</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>@​sentry/browser</code></td>
<td>23.29 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> - with treeshaking flags</td>
<td>21.96 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing)</td>
<td>35.85 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay)</td>
<td>73.09 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay) - with
treeshaking flags</td>
<td>63.48 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay with
Canvas)</td>
<td>77.4 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay, Feedback)</td>
<td>89.34 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Feedback)</td>
<td>39.5 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. sendFeedback)</td>
<td>27.89 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. FeedbackAsync)</td>
<td>32.69 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code></td>
<td>25.96 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code> (incl. Tracing)</td>
<td>38.66 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code></td>
<td>27.56 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code> (incl. Tracing)</td>
<td>37.69 KB</td>
</tr>
<tr>
<td><code>@​sentry/svelte</code></td>
<td>23.45 KB</td>
</tr>
<tr>
<td>CDN Bundle</td>
<td>24.49 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing)</td>
<td>37.56 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay)</td>
<td>72.75 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay, Feedback)</td>
<td>78.11 KB</td>
</tr>
<tr>
<td>CDN Bundle - uncompressed</td>
<td>71.93 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing) - uncompressed</td>
<td>111.42 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay) - uncompressed</td>
<td>225.5 KB</td>
</tr>
</tbody>
</table>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/blob/8.48.0/CHANGELOG.md"><code>@​sentry/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.48.0</h2>
<h3>Deprecations</h3>
<ul>
<li>
<p><strong>feat(v8/core): Deprecate <code>getDomElement</code> method
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14799">#14799</a>)</strong></p>
<p>Deprecates <code>getDomElement</code>. There is no replacement.</p>
</li>
</ul>
<h3>Other changes</h3>
<ul>
<li>fix(nestjs/v8): Use correct main/module path in package.json (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14791">#14791</a>)</li>
<li>fix(v8/core): Use consistent <code>continueTrace</code>
implementation in core (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14819">#14819</a>)</li>
<li>fix(v8/node): Correctly resolve debug IDs for ANR events with custom
appRoot (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14823">#14823</a>)</li>
<li>fix(v8/node): Ensure <code>NODE_OPTIONS</code> is not passed to
worker threads (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14825">#14825</a>)</li>
<li>fix(v8/angular): Fall back to element <code>tagName</code> when name
is not provided to <code>TraceDirective</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14828">#14828</a>)</li>
<li>fix(aws-lambda): Remove version suffix from lambda layer (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14843">#14843</a>)</li>
<li>fix(v8/node): Ensure express requests are properly handled (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14851">#14851</a>)</li>
<li>feat(v8/node): Add <code>openTelemetrySpanProcessors</code> option
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14853">#14853</a>)</li>
<li>fix(v8/react): Use <code>Set</code> as the <code>allRoutes</code>
container. (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14878">#14878</a>)
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14884">#14884</a>)</li>
<li>fix(v8/react): Improve handling of routes nested under
path=&quot;/&quot; (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14897">#14897</a>)</li>
<li>feat(v8/core): Add <code>normalizedRequest</code> to
<code>samplingContext</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14903">#14903</a>)</li>
<li>fix(v8/feedback): Avoid lazy loading code for
<code>syncFeedbackIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14918">#14918</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/arturovt"><code>@​arturovt</code></a>. Thank
you for your contribution!</p>
<h2>8.47.0</h2>
<ul>
<li>feat(v8/core): Add <code>updateSpanName</code> helper function (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14736">#14736</a>)</li>
<li>feat(v8/node): Do not overwrite prisma <code>db.system</code> in
newer Prisma versions (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14772">#14772</a>)</li>
<li>feat(v8/node/deps): Bump <code>@​prisma/instrumentation</code> from
5.19.1 to 5.22.0 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14755">#14755</a>)</li>
<li>feat(v8/replay): Mask srcdoc iframe contents per default (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14779">#14779</a>)</li>
<li>ref(v8/nextjs): Fix typo in source maps deletion warning (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14776">#14776</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/aloisklink"><code>@​aloisklink</code></a> and
<a href="https://github.com/benjick"><code>@​benjick</code></a>. Thank
you for your contributions!</p>
<h2>8.46.0</h2>
<ul>
<li>feat: Allow capture of more than 1 ANR event [v8] (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14713">#14713</a>)</li>
<li>feat(node): Detect Railway release name [v8] (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14714">#14714</a>)</li>
<li>fix: Normalise ANR debug image file paths if appRoot was supplied
[v8] (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14709">#14709</a>)</li>
<li>fix(nuxt): Remove build config from tsconfig (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14737">#14737</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/conor-ob"><code>@​conor-ob</code></a>. Thank
you for your contribution!</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="405ceb4a4d"><code>405ceb4</code></a>
release: 8.48.0</li>
<li><a
href="8e2ed6f82a"><code>8e2ed6f</code></a>
meta(changelog): Update changelog for 8.48.0 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14919">#14919</a>)</li>
<li><a
href="8b03e0b421"><code>8b03e0b</code></a>
fix(v8/feedback): Avoid lazy loading code for
<code>syncFeedbackIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14918">#14918</a>)</li>
<li><a
href="77cabfbc33"><code>77cabfb</code></a>
fix(v8/react): Use <code>Set</code> as the <code>allRoutes</code>
container. (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14878">#14878</a>)
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14884">#14884</a>)</li>
<li><a
href="6fa3797ddb"><code>6fa3797</code></a>
feat(v8/core): Add <code>normalizedRequest</code> to
<code>samplingContext</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14903">#14903</a>)</li>
<li><a
href="845b7aa2e0"><code>845b7aa</code></a>
fix(v8/react): Improve handling of routes nested under
path=&quot;/&quot; (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14897">#14897</a>)</li>
<li><a
href="dbd3296580"><code>dbd3296</code></a>
feat(v8/node): Add <code>openTelemetrySpanProcessors</code> option (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14853">#14853</a>)</li>
<li><a
href="960dd9be89"><code>960dd9b</code></a>
fix(v8/node): Ensure express requests are properly handled (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14851">#14851</a>)</li>
<li><a
href="576a1ad0f2"><code>576a1ad</code></a>
fix(v8/core): Use consistent <code>continueTrace</code> implementation
in core (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14819">#14819</a>)</li>
<li><a
href="75ca8b9c5a"><code>75ca8b9</code></a>
meta(changelog): Update changelog for 8.48.0 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14844">#14844</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-javascript/compare/8.45.1...8.48.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `framer-motion` from 11.15.0 to 11.16.0
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/motiondivision/motion/blob/main/CHANGELOG.md">framer-motion's
changelog</a>.</em></p>
<blockquote>
<h2>[11.16.0] 2024-01-06</h2>
<h3>Added</h3>
<ul>
<li>Added <code>view()</code> alpha to early access.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0d6f15819d"><code>0d6f158</code></a>
v11.16.0</li>
<li><a
href="60b365926c"><code>60b3659</code></a>
Updating changelog</li>
<li><a
href="d22113827f"><code>d221138</code></a>
Feature/view function (<a
href="https://redirect.github.com/motiondivision/motion/issues/2970">#2970</a>)</li>
<li><a
href="8ca78c00b6"><code>8ca78c0</code></a>
Updating readmes</li>
<li><a
href="ecd97f7dce"><code>ecd97f7</code></a>
Updating readme</li>
<li>See full diff in <a
href="https://github.com/motiondivision/motion/compare/v11.15.0...v11.16.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `lucide-react` from 0.468.0 to 0.469.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/lucide-icons/lucide/releases">lucide-react's
releases</a>.</em></p>
<blockquote>
<h2>New icons 0.469.0</h2>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>snowflake</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2610">#2610</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
<li><code>sun-snow</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2610">#2610</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
<li><code>thermometer-snowflake</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2610">#2610</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="970fc3d4be"><code>970fc3d</code></a>
fix(lucide-react): support React 19 (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2666">#2666</a>)</li>
<li>See full diff in <a
href="https://github.com/lucide-icons/lucide/commits/0.469.0/packages/lucide-react">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-day-picker` from 9.4.4 to 9.5.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gpbl/react-day-picker/releases">react-day-picker's
releases</a>.</em></p>
<blockquote>
<h2>v9.5.0</h2>
<p>This release adds full support for the <a
href="https://daypicker.dev/docs/localization#persian-calendar">Persian
calendar</a> and a new <code>numerals</code> prop to <a
href="https://daypicker.dev/docs/translation#numeral-systems">set the
numbering system</a>.</p>
<h3>Breaking Change: Dropdown Formatters</h3>
<p>The <code>formatMonthDropdown</code> and
<code>formatYearDropdown</code> now receive a <code>Date</code> (instead
of a <code>number</code>) as first argument.</p>
<pre lang="diff"><code>&lt;DayPicker formatters={{ 
- formatMonthDropdown: (month) =&gt; format(new Date(month),
&quot;mmmm&quot;) }}
+ formatMonthDropdown: (date) =&gt; format(date, &quot;mmmm&quot;) }}
/&gt;
- formatYearDropdown: (year) =&gt; format(new Date(year),
&quot;yyyy&quot;) }}
+ formatYearDropdown: (date) =&gt; format(date, &quot;yyyy&quot;) }}
/&gt;
/&gt;
</code></pre>
<h3>Persian Calendar</h3>
<p>The Persian calendar gets full support in DayPicker and replaces the
previous &quot;Jalali Calendar&quot;.</p>
<p>If you were using DayPicker from
<code>react-day-picker/jalali</code>, change your imports to
<code>react-day-picker/persian</code>:</p>
<pre lang="diff"><code>- import { DayPicker } from
`react-day-picker/jalali`;
+ import { DayPicker } from  `react-day-picker/persian`;
</code></pre>
<p>See the <a
href="https://daypicker.dev/docs/localization#persian-calendar">Persian
calendar</a> documentation for more details about using Persian calendar
in DayPicker.</p>
<h2>What's Changed</h2>
<ul>
<li>feat: add Persian calendar support by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2645">gpbl/react-day-picker#2645</a></li>
<li>feat: add new <code>numerals</code> prop by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2647">gpbl/react-day-picker#2647</a></li>
<li>feat: add <code>today</code>, <code>newDate</code>,
<code>timeZone</code> to the <code>DateLib</code> class by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2642">gpbl/react-day-picker#2642</a></li>
<li>feat: remove <code>startMonth</code>/<code>endMonth</code>
constraints when caption layout is <code>dropdown-months</code> by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2648">gpbl/react-day-picker#2648</a></li>
<li>build: add <code>date-fns-jalali</code> to the package dependencies
by <a href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2640">gpbl/react-day-picker#2640</a></li>
<li>fix(breaking): dropdown formatters to use <code>dateLib</code>
format by <a href="https://github.com/gpbl"><code>@​gpbl</code></a> in
<a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2644">gpbl/react-day-picker#2644</a></li>
<li>fix(jalali): incorrect Jalali month names when using dropdown
layouts by <a href="https://github.com/gpbl"><code>@​gpbl</code></a> in
<a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2645">gpbl/react-day-picker#2645</a></li>
<li>fix(chore): always use <code>Date</code> constructor from
<code>dateLib</code> by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2636">gpbl/react-day-picker#2636</a></li>
<li>fix(chore): use <code>dateLib</code> for getting days/months/years
from a <code>Date</code> by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2643">gpbl/react-day-picker#2643</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gpbl/react-day-picker/compare/v9.4.4...v9.5.0">https://github.com/gpbl/react-day-picker/compare/v9.4.4...v9.5.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a06052e71b"><code>a06052e</code></a>
fix(website): Persian test</li>
<li><a
href="aa0cacb04e"><code>aa0cacb</code></a>
fix(website): Persian formatted</li>
<li><a
href="e65d776f51"><code>e65d776</code></a>
website: update docs for Persian calendar</li>
<li><a
href="15280b7b81"><code>15280b7</code></a>
chore(persian): use <code>arabext</code> numerals as default</li>
<li><a
href="56c7acbaf1"><code>56c7acb</code></a>
chore: update tests after <a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2648">#2648</a>
(<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2650">#2650</a>)</li>
<li><a
href="bcd5215000"><code>bcd5215</code></a>
feat: remove <code>startMonth</code> and <code>endMonth</code> default
constraints when `dropdown-m...</li>
<li><a
href="1026b2ca28"><code>1026b2c</code></a>
website: playground updates</li>
<li><a
href="d09e2acdc1"><code>d09e2ac</code></a>
feat: add new <code>numerals</code> prop (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2647">#2647</a>)</li>
<li><a
href="8a67562252"><code>8a67562</code></a>
website: fix type in ShadowDomWrapper</li>
<li><a
href="5fabec63a4"><code>5fabec6</code></a>
build(website): disable typedoc watch</li>
<li>Additional commits viewable in <a
href="https://github.com/gpbl/react-day-picker/compare/v9.4.4...v9.5.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-hook-form` from 7.54.1 to 7.54.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/react-hook-form/react-hook-form/releases">react-hook-form's
releases</a>.</em></p>
<blockquote>
<h2>Version 7.54.2</h2>
<p>⚛️ fix <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12478">#12478</a>
issue should unregister input with controller (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12480">#12480</a>)
 close <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12443">#12443</a>
track disabled fields and only omit data on submit (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12491">#12491</a>)
⚛️ upgrade e2e automation app to react 19 (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12482">#12482</a>)
🧪 test(useWatch): destructure setValue from useForm</p>
<p>Thanks very much, <a
href="https://github.com/marcalexiei"><code>@​marcalexiei</code></a> for
your contribution to the documentation!</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ba87b7809e"><code>ba87b78</code></a>
7.54.2</li>
<li><a
href="c3d1756733"><code>c3d1756</code></a>
 close <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12443">#12443</a>
track disabled fields and only omit data on submit (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12491">#12491</a>)</li>
<li><a
href="5a961592fe"><code>5a96159</code></a>
⚛️ upgrade e2e automation app to react 19 (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12482">#12482</a>)</li>
<li><a
href="4ea65b372e"><code>4ea65b3</code></a>
⚛️ fix <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12478">#12478</a>
issue should unregister input with controller (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12480">#12480</a>)</li>
<li><a
href="f37465d135"><code>f37465d</code></a>
❤️ thank you very much Workleap for sponsoring the project!</li>
<li><a
href="506fa04d44"><code>506fa04</code></a>
🧪 test(useWatch): destructure setValue from useForm</li>
<li>See full diff in <a
href="https://github.com/react-hook-form/react-hook-form/compare/v7.54.1...v7.54.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-markdown` from 9.0.1 to 9.0.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/remarkjs/react-markdown/releases">react-markdown's
releases</a>.</em></p>
<blockquote>
<h2>9.0.3</h2>
<p>(same as 9.0.2 but now with d.ts files)</p>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/remarkjs/react-markdown/compare/9.0.2...9.0.3">https://github.com/remarkjs/react-markdown/compare/9.0.2...9.0.3</a></p>
<h2>9.0.2</h2>
<h4>Types</h4>
<ul>
<li>b151a90 Fix types for React 19
by <a
href="https://github.com/remcohaszing"><code>@​remcohaszing</code></a>
in <a
href="https://redirect.github.com/remarkjs/react-markdown/pull/879">remarkjs/react-markdown#879</a></li>
<li>6962af7 Add declaration maps</li>
<li>aa5933b Refactor to use <code>@import</code> to import types
by <a
href="https://github.com/remcohaszing"><code>@​remcohaszing</code></a>
in <a
href="https://redirect.github.com/remarkjs/react-markdown/pull/836">remarkjs/react-markdown#836</a></li>
</ul>
<h4>Miscellaneous</h4>
<ul>
<li>9eb589e Fix typo in changelog
by <a
href="https://github.com/NicholasWilsonDEV"><code>@​NicholasWilsonDEV</code></a>
in <a
href="https://redirect.github.com/remarkjs/react-markdown/pull/874">remarkjs/react-markdown#874</a></li>
<li>515bf19 Fix typo
by <a href="https://github.com/deep-lyra"><code>@​deep-lyra</code></a>
in <a
href="https://redirect.github.com/remarkjs/react-markdown/pull/868">remarkjs/react-markdown#868</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/remarkjs/react-markdown/compare/9.0.1...9.0.2">https://github.com/remarkjs/react-markdown/compare/9.0.1...9.0.2</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="aed001070a"><code>aed0010</code></a>
9.0.3</li>
<li><a
href="40c097eb6f"><code>40c097e</code></a>
9.0.2</li>
<li><a
href="2c6ffe8f93"><code>2c6ffe8</code></a>
Refactor <code>.gitignore</code></li>
<li><a
href="b664ac4459"><code>b664ac4</code></a>
Update Actions</li>
<li><a
href="e68655127b"><code>e686551</code></a>
Update dev-dependencies</li>
<li><a
href="b151a9028f"><code>b151a90</code></a>
Fix types for React 19</li>
<li><a
href="27d3949b31"><code>27d3949</code></a>
Separate all typedefs into their own JSDoc blocks (<a
href="https://redirect.github.com/remarkjs/react-markdown/issues/878">#878</a>)</li>
<li><a
href="9eb589e828"><code>9eb589e</code></a>
Fix typo in changelog</li>
<li><a
href="515bf190a0"><code>515bf19</code></a>
Fix typo</li>
<li><a
href="a7ca8edfd6"><code>a7ca8ed</code></a>
Refactor <code>.editorconfig</code></li>
<li>Additional commits viewable in <a
href="https://github.com/remarkjs/react-markdown/compare/9.0.1...9.0.3">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-modal` from 3.16.1 to 3.16.3
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/reactjs/react-modal/blob/master/CHANGELOG.md">react-modal's
changelog</a>.</em></p>
<blockquote>
<h2>3.16.3 - Tue, 17 Dec 2024 10:38:34 UTC</h2>
<ul>
<li><a
href="https://github.com/reactjs/react-modal/commit/a5c0cf4">a5c0cf4</a>
removing restriction on node engines.</li>
</ul>
<h2>3.16.2 - Tue, 17 Dec 2024 09:11:34 UTC</h2>
<ul>
<li><a
href="https://github.com/reactjs/react-modal/commit/b91c724">b91c724</a>
updade react and react-dom peer dependencies.</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/a275399">a275399</a>
simplify PR template.</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/588f26b">588f26b</a>
contributing requirements now just need a corresponding issue... on
GitHub board</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/449398d">449398d</a>
remove discussion note from readme.</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/e4841d6">e4841d6</a>
chore: update shouldCloseOnOverlayClick doc</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/6724a04">6724a04</a>
Fix tests</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/7c1d947">7c1d947</a>
Fix badge</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/96a81be">96a81be</a>
Comment the ellipsis in code blocks in docs/index.md</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/aff8b91">aff8b91</a>
[added] add nodejs version restriction to package.json</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/321966e">321966e</a>
[changed] change Miscellaneous related nodejs version text</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/8dc2347">8dc2347</a>
[added] add Miscellaneous section to the contributions.md file</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/f9bc6a0">f9bc6a0</a>
[fixed] strict matching for tabbable nodes</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/e7c4a63">e7c4a63</a>
downgrade node version on github action.</li>
<li><a
href="https://github.com/reactjs/react-modal/commit/1a8f562">1a8f562</a>
running tests on github actions</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7a2a63c91c"><code>7a2a63c</code></a>
Release v3.16.3.</li>
<li><a
href="a5c0cf414d"><code>a5c0cf4</code></a>
removing restriction on node engines.</li>
<li><a
href="8f683027f8"><code>8f68302</code></a>
Release v3.16.2.</li>
<li><a
href="b91c7245b7"><code>b91c724</code></a>
updade react and react-dom peer dependencies.</li>
<li><a
href="a275399059"><code>a275399</code></a>
simplify PR template.</li>
<li><a
href="588f26b060"><code>588f26b</code></a>
contributing requirements now just need a corresponding issue...</li>
<li><a
href="449398da1e"><code>449398d</code></a>
remove discussion note from readme.</li>
<li><a
href="e4841d66d1"><code>e4841d6</code></a>
chore: update shouldCloseOnOverlayClick doc</li>
<li><a
href="6724a049c1"><code>6724a04</code></a>
Fix tests</li>
<li><a
href="7c1d947226"><code>7c1d947</code></a>
Fix badge</li>
<li>Additional commits viewable in <a
href="https://github.com/reactjs/react-modal/compare/v3.16.1...v3.16.3">compare
view</a></li>
</ul>
</details>
<br />

Updates `tailwind-merge` from 2.5.5 to 2.6.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/dcastil/tailwind-merge/releases">tailwind-merge's
releases</a>.</em></p>
<blockquote>
<h2>v2.6.0</h2>
<h3>New Features</h3>
<ul>
<li>Export ConfigExtension type from package by <a
href="https://github.com/dcastil"><code>@​dcastil</code></a> in <a
href="https://redirect.github.com/dcastil/tailwind-merge/pull/505">dcastil/tailwind-merge#505</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/dcastil/tailwind-merge/compare/v2.5.5...v2.6.0">https://github.com/dcastil/tailwind-merge/compare/v2.5.5...v2.6.0</a></p>
<p>Thanks to <a
href="https://github.com/brandonmcconnell"><code>@​brandonmcconnell</code></a>,
<a href="https://github.com/manavm1990"><code>@​manavm1990</code></a>,
<a href="https://github.com/langy"><code>@​langy</code></a>, <a
href="https://github.com/jamesreaco"><code>@​jamesreaco</code></a>, <a
href="https://github.com/roboflow"><code>@​roboflow</code></a>, <a
href="https://github.com/syntaxfm"><code>@​syntaxfm</code></a>, <a
href="https://github.com/getsentry"><code>@​getsentry</code></a>, <a
href="https://github.com/codecov"><code>@​codecov</code></a>, <a
href="https://github.com/sourcegraph"><code>@​sourcegraph</code></a>, a
private sponsor and more via <a
href="https://github.com/thnxdev"><code>@​thnxdev</code></a> for
sponsoring tailwind-merge! ❤️</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1a92c358e0"><code>1a92c35</code></a>
v2.6.0</li>
<li><a
href="64803754e7"><code>6480375</code></a>
add changelog for v2.6.0</li>
<li><a
href="7bb2dc0e02"><code>7bb2dc0</code></a>
Merge pull request <a
href="https://redirect.github.com/dcastil/tailwind-merge/issues/509">#509</a>
from dcastil/renovate/rollup-plugin-node-resolve-16.x</li>
<li><a
href="19eb0a1476"><code>19eb0a1</code></a>
chore(deps): update dependency <code>@​rollup/plugin-node-resolve</code>
to v16</li>
<li><a
href="d6f10146e3"><code>d6f1014</code></a>
Merge pull request <a
href="https://redirect.github.com/dcastil/tailwind-merge/issues/508">#508</a>
from dcastil/renovate/codspeed-vitest-plugin-4.x</li>
<li><a
href="d039e296dd"><code>d039e29</code></a>
chore(deps): update dependency <code>@​codspeed/vitest-plugin</code> to
v4</li>
<li><a
href="4aac490b6f"><code>4aac490</code></a>
Merge pull request <a
href="https://redirect.github.com/dcastil/tailwind-merge/issues/507">#507</a>
from dcastil/renovate/migrate-config</li>
<li><a
href="433e53208a"><code>433e532</code></a>
chore(config): migrate config .github/renovate.json</li>
<li><a
href="31da3f22d7"><code>31da3f2</code></a>
fix unsupported import assertion</li>
<li><a
href="34078eee52"><code>34078ee</code></a>
Merge pull request <a
href="https://redirect.github.com/dcastil/tailwind-merge/issues/506">#506</a>
from dcastil/other/upgrade-github-workflows-to-node-22</li>
<li>Additional commits viewable in <a
href="https://github.com/dcastil/tailwind-merge/compare/v2.5.5...v2.6.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `uuid` from 11.0.3 to 11.0.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/uuidjs/uuid/releases">uuid's
releases</a>.</em></p>
<blockquote>
<h2>v11.0.4</h2>
<h2><a
href="https://github.com/uuidjs/uuid/compare/v11.0.3...v11.0.4">11.0.4</a>
(2025-01-05)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>docs:</strong> insure -&gt; ensure (<a
href="https://redirect.github.com/uuidjs/uuid/issues/843">#843</a>) (<a
href="d2a61e154d">d2a61e1</a>)</li>
<li>exclude tests from published package (<a
href="https://redirect.github.com/uuidjs/uuid/issues/840">#840</a>) (<a
href="f992ff4780">f992ff4</a>)</li>
<li>Test for invalid byte array sizes and ranges in <code>v1()</code>,
<code>v4()</code>, and <code>v7()</code> (<a
href="https://redirect.github.com/uuidjs/uuid/issues/845">#845</a>) (<a
href="e0ee90051e">e0ee900</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/uuidjs/uuid/blob/main/CHANGELOG.md">uuid's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/uuidjs/uuid/compare/v11.0.3...v11.0.4">11.0.4</a>
(2025-01-05)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>docs:</strong> insure -&gt; ensure (<a
href="https://redirect.github.com/uuidjs/uuid/issues/843">#843</a>) (<a
href="d2a61e154d">d2a61e1</a>)</li>
<li>exclude tests from published package (<a
href="https://redirect.github.com/uuidjs/uuid/issues/840">#840</a>) (<a
href="f992ff4780">f992ff4</a>)</li>
<li>Test for invalid byte array sizes and ranges in <code>v1()</code>,
<code>v4()</code>, and <code>v7()</code> (<a
href="https://redirect.github.com/uuidjs/uuid/issues/845">#845</a>) (<a
href="e0ee90051e">e0ee900</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="050cd5b9df"><code>050cd5b</code></a>
chore(main): release 11.0.4 (<a
href="https://redirect.github.com/uuidjs/uuid/issues/842">#842</a>)</li>
<li><a
href="e0ee90051e"><code>e0ee900</code></a>
fix: Test for invalid byte array sizes and ranges in <code>v1()</code>,
<code>v4()</code>, and `v7(...</li>
<li><a
href="6e83b3ae83"><code>6e83b3a</code></a>
chore: update deps (<a
href="https://redirect.github.com/uuidjs/uuid/issues/848">#848</a>)</li>
<li><a
href="5f58b43aa4"><code>5f58b43</code></a>
docs: Ensure link to getrandomvalues-not-supported is maintained (<a
href="https://redirect.github.com/uuidjs/uuid/issues/844">#844</a>)</li>
<li><a
href="d2a61e154d"><code>d2a61e1</code></a>
fix(docs): insure -&gt; ensure (<a
href="https://redirect.github.com/uuidjs/uuid/issues/843">#843</a>)</li>
<li><a
href="f992ff4780"><code>f992ff4</code></a>
fix: exclude tests from published package (<a
href="https://redirect.github.com/uuidjs/uuid/issues/840">#840</a>)</li>
<li><a
href="59df7092c7"><code>59df709</code></a>
docs: add notes on platform support (<a
href="https://redirect.github.com/uuidjs/uuid/issues/838">#838</a>)</li>
<li>See full diff in <a
href="https://github.com/uuidjs/uuid/compare/v11.0.3...v11.0.4">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-07 18:42:25 +00:00
Swifty 0b9c0c9f12
refactor(marketplace): Delete Old marketplace code (#9164)
Needs to be coordinated with Infra PR
https://github.com/Significant-Gravitas/AutoGPT_cloud_infrastructure/pull/20

DO NOT MERGE WITHOUT SYNCING BOTH CHANGES

### Changes 🏗️

- Delete marketplace
2025-01-07 10:02:21 +00:00
dependabot[bot] 7e80401083
chore(libs/deps-dev): bump ruff from 0.8.3 to 0.8.6 in /autogpt_platform/autogpt_libs in the development-dependencies group across 1 directory (#9202)
Bumps the development-dependencies group with 1 update in the
/autogpt_platform/autogpt_libs directory:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.8.3 to 0.8.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.8.6</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>format</code>]: Preserve multiline implicit concatenated
strings in docstring positions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15126">#15126</a>)</li>
<li>[<code>ruff</code>] Add rule to detect empty literal in deque call
(<code>RUF025</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15104">#15104</a>)</li>
<li>[<code>ruff</code>] Avoid reporting when <code>ndigits</code> is
possibly negative (<code>RUF057</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15234">#15234</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-todos</code>] remove issue code length restriction
(<code>TD003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15175">#15175</a>)</li>
<li>[<code>pyflakes</code>] Ignore errors in <code>@no_type_check</code>
string annotations (<code>F722</code>, <code>F821</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15215">#15215</a>)</li>
</ul>
<h3>CLI</h3>
<ul>
<li>Show errors for attempted fixes only when passed
<code>--verbose</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15237">#15237</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>ruff</code>] Avoid syntax error when removing int over
multiple lines (<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15230">#15230</a>)</li>
<li>[<code>pyupgrade</code>] Revert &quot;Add all PEP-585 names to
<code>UP006</code> rule&quot; (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15250">#15250</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a href="https://github.com/Lee-W"><code>@​Lee-W</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/augustelalande"><code>@​augustelalande</code></a></li>
<li><a
href="https://github.com/charliermarsh"><code>@​charliermarsh</code></a></li>
<li><a
href="https://github.com/dcreager"><code>@​dcreager</code></a></li>
<li><a href="https://github.com/dylwil3"><code>@​dylwil3</code></a></li>
<li><a
href="https://github.com/mdbernard"><code>@​mdbernard</code></a></li>
<li><a href="https://github.com/sharkdp"><code>@​sharkdp</code></a></li>
<li><a
href="https://github.com/w0nder1ng"><code>@​w0nder1ng</code></a></li>
</ul>
<h2>Install ruff 0.8.6</h2>
<h3>Install prebuilt binaries via shell script</h3>
<pre lang="sh"><code>curl --proto '=https' --tlsv1.2 -LsSf
https://github.com/astral-sh/ruff/releases/download/0.8.6/ruff-installer.sh
| sh
</code></pre>
<h3>Install prebuilt binaries via powershell script</h3>
<pre lang="sh"><code>powershell -ExecutionPolicy ByPass -c &quot;irm
https://github.com/astral-sh/ruff/releases/download/0.8.6/ruff-installer.ps1
| iex&quot;
</code></pre>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
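To give a concrete sense of what the new preview rules above target, here is a small hand-written Python snippet (illustrative only, not taken from the ruff docs):

```python
from collections import deque

# The new "empty literal in deque call" rule flags the redundant empty list:
tasks = deque([])   # preferred: deque()

# The unnecessary-round rule (RUF057) targets calls where round() cannot
# change anything, e.g. rounding an int with no meaningful ndigits:
count = round(10)   # redundant round()

# Per the notes above, RUF057 now stays quiet when ndigits may be negative,
# since a negative ndigits genuinely changes the value:
def to_hundreds(amount: int, ndigits: int) -> int:
    return round(amount, ndigits)   # not reported: ndigits could be < 0
```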
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.8.6</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>format</code>]: Preserve multiline implicit concatenated
strings in docstring positions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15126">#15126</a>)</li>
<li>[<code>ruff</code>] Add rule to detect empty literal in deque call
(<code>RUF025</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15104">#15104</a>)</li>
<li>[<code>ruff</code>] Avoid reporting when <code>ndigits</code> is
possibly negative (<code>RUF057</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15234">#15234</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-todos</code>] remove issue code length restriction
(<code>TD003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15175">#15175</a>)</li>
<li>[<code>pyflakes</code>] Ignore errors in <code>@no_type_check</code>
string annotations (<code>F722</code>, <code>F821</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15215">#15215</a>)</li>
</ul>
<h3>CLI</h3>
<ul>
<li>Show errors for attempted fixes only when passed
<code>--verbose</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15237">#15237</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>ruff</code>] Avoid syntax error when removing int over
multiple lines (<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15230">#15230</a>)</li>
<li>[<code>pyupgrade</code>] Revert &quot;Add all PEP-585 names to
<code>UP006</code> rule&quot; (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15250">#15250</a>)</li>
</ul>
<h2>0.8.5</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Extend names moved from core to provider
(<code>AIR303</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15145">#15145</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/15159">#15159</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/15196">#15196</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/15216">#15216</a>)</li>
<li>[<code>airflow</code>] Extend rule to check class attributes,
methods, arguments (<code>AIR302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15054">#15054</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/15083">#15083</a>)</li>
<li>[<code>fastapi</code>] Update <code>FAST002</code> to check
keyword-only arguments (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15119">#15119</a>)</li>
<li>[<code>flake8-type-checking</code>] Disable <code>TC006</code> and
<code>TC007</code> in stub files (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15179">#15179</a>)</li>
<li>[<code>pylint</code>] Detect nested methods correctly
(<code>PLW1641</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15032">#15032</a>)</li>
<li>[<code>ruff</code>] Detect more strict-integer expressions
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14833">#14833</a>)</li>
<li>[<code>ruff</code>] Implement <code>falsy-dict-get-fallback</code>
(<code>RUF056</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15160">#15160</a>)</li>
<li>[<code>ruff</code>] Implement <code>unnecessary-round</code>
(<code>RUF057</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14828">#14828</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>Visit PEP 764 inline <code>TypedDict</code> keys as
non-type-expressions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15073">#15073</a>)</li>
<li>[<code>flake8-comprehensions</code>] Skip <code>C416</code> if
comprehension contains unpacking (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14909">#14909</a>)</li>
<li>[<code>flake8-pie</code>] Allow <code>cast(SomeType, ...)</code>
(<code>PIE796</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15141">#15141</a>)</li>
<li>[<code>flake8-simplify</code>] More precise inference for
dictionaries (<code>SIM300</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15164">#15164</a>)</li>
<li>[<code>flake8-use-pathlib</code>] Catch redundant joins in
<code>PTH201</code> and avoid syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15177">#15177</a>)</li>
<li>[<code>pycodestyle</code>] Preserve original value format
(<code>E731</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15097">#15097</a>)</li>
<li>[<code>pydocstyle</code>] Split on first whitespace character
(<code>D403</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15082">#15082</a>)</li>
<li>[<code>pyupgrade</code>] Add all PEP-585 names to <code>UP006</code>
rule (<a
href="https://redirect.github.com/astral-sh/ruff/pull/5454">#5454</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>[<code>flake8-type-checking</code>] Improve flexibility of
<code>runtime-evaluated-decorators</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15204">#15204</a>)</li>
<li>[<code>pydocstyle</code>] Add setting to ignore missing
documentation for <code>*args</code> and <code>**kwargs</code>
parameters (<code>D417</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15210">#15210</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6b907c1305"><code>6b907c1</code></a>
Ruff 0.8.6 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15253">#15253</a>)</li>
<li><a
href="f319531632"><code>f319531</code></a>
Make unreachable a test rule for now (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15252">#15252</a>)</li>
<li><a
href="e4d9fe036a"><code>e4d9fe0</code></a>
Revert &quot;Add all PEP-585 names to UP006 rule&quot; (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15250">#15250</a>)</li>
<li><a
href="baf0d660eb"><code>baf0d66</code></a>
Update salsa (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15243">#15243</a>)</li>
<li><a
href="bde8ecddca"><code>bde8ecd</code></a>
[red-knot] Remove unneeded branch in
<code>Type::is_equivalent_to()</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15242">#15242</a>)</li>
<li><a
href="842f882ef0"><code>842f882</code></a>
[<code>ruff</code>] Avoid reporting when <code>ndigits</code> is
possibly negative (<code>RUF057</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15234">#15234</a>)</li>
<li><a
href="75015b0ed9"><code>75015b0</code></a>
Attribute panics to the mdtests that cause them (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15241">#15241</a>)</li>
<li><a
href="706d87f239"><code>706d87f</code></a>
Show errors for attempted fixes only when passed <code>--verbose</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/issues/15237">#15237</a>)</li>
<li><a
href="0837cdd931"><code>0837cdd</code></a>
[<code>RUF</code>] Add rule to detect empty literal in deque call
(<code>RUF025</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15104">#15104</a>)</li>
<li><a
href="0dbfa8d0e0"><code>0dbfa8d</code></a>
TD003: remove issue code length restriction (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15175">#15175</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.3...0.8.6">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.8.3&new-version=0.8.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Swifty <craigswift13@gmail.com>
2025-01-07 07:56:41 +00:00
Reinier van der Leer 96fae5a5c8
fix(backend): Fix intermittent failure of `test_agent_execution` (#9210)
- Fixed race condition in `create_graph` to preserve node order
- Resolves #9123
2025-01-07 03:42:33 +00:00
Reinier van der Leer 7a9a771718
fix(backend): Fix webhook ingress URL generation (#9209)
The enum's string *representation* was being inserted in the URL instead
of its string *value*.

Before:
`/api/integrations/ProviderName.GITHUB/webhooks/686db48c-e70d-4340-acf9-ccd0338fddc4/ingress`

After:
`/api/integrations/github/webhooks/686db48c-e70d-4340-acf9-ccd0338fddc4/ingress`
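A minimal sketch of the mix-up (using a stand-in enum, not the actual `ProviderName` definition):

```python
from enum import Enum

class ProviderName(Enum):  # stand-in for the real provider enum
    GITHUB = "github"

provider = ProviderName.GITHUB
webhook_id = "686db48c-e70d-4340-acf9-ccd0338fddc4"

# Interpolating the member itself uses its *representation*:
print(f"/api/integrations/{provider}/webhooks/{webhook_id}/ingress")
# -> /api/integrations/ProviderName.GITHUB/webhooks/.../ingress

# Interpolating the member's *value* gives the intended URL:
print(f"/api/integrations/{provider.value}/webhooks/{webhook_id}/ingress")
# -> /api/integrations/github/webhooks/.../ingress
```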

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-06 23:45:35 +00:00
Reinier van der Leer c3caa111e4
feat(backend/executor): Add `TERMINATED` execution status (#9185)
- Resolves #9182

Formerly known as `FAILED` with error message `TERMINATED`.
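As a rough sketch of the shape of the change (names mirror the description below but are not copied from the codebase):

```python
from enum import Enum

class AgentExecutionStatus(str, Enum):  # simplified stand-in
    RUNNING = "RUNNING"
    COMPLETED = "COMPLETED"
    FAILED = "FAILED"
    TERMINATED = "TERMINATED"  # new: forcefully stopped, no longer reported as FAILED

def is_finished(status: AgentExecutionStatus) -> bool:
    # Every status check referencing the enum now needs a TERMINATED case.
    return status in (
        AgentExecutionStatus.COMPLETED,
        AgentExecutionStatus.FAILED,
        AgentExecutionStatus.TERMINATED,
    )
```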

### Changes 🏗️

- Add `TERMINATED` to `AgentExecutionStatus` enum in DB schema (and its
mirror in the front end)
- Update executor to give terminated node and graph executions status
`TERMINATED` instead of `FAILED`/`COMPLETED`
- Add `TERMINATED` case to status checks referencing
`AgentExecutionStatus`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Start and forcefully stop a graph execution

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-06 22:59:49 +00:00
Reinier van der Leer 081c4a6df2
dx: Fix `isort` pre-commit hooks
Since upgrading to Poetry v2.0.0, the -C flag has been renamed to -P.
2025-01-07 00:02:39 +01:00
Reinier van der Leer d638c1f484
Fix Poetry v2.0.0 compatibility (#9197)
Make all changes necessary to make everything work with Poetry v2.0.0.

- Resolves #9196

## Changes
- Removed `--no-update` flag from `poetry lock` command in codebase
- Removed extra path arguments from `poetry -C [path] run [command]`
occurrences
- Regenerated all lock files in hierarchical order
- Added workaround for Poetry bug where `packages.[i].format` is now
suddenly required

Additionally:
- Fixed up .dockerignore
  - Fixes .venv being erroneously copied over from local
  - Fixes build context bloat (300MB -> 2.5MB)
- Fixed warnings about entrypoint script not being installed in docker
builds

### Relevant (breaking) changes in v2.0.0
- `--no-update` flag no longer exists for `poetry lock` as it has become
default behavior
- The `-C` option now actually changes the directory, so any path
arguments in `poetry run` commands can/must be removed
- Poetry v2.0.0 uses the new v2.1 lock file spec, so all lock files have
to be regenerated to avoid false-positive lock file updates and checks
on future PRs
- **BUG:** when specifying `tool.poetry.packages`, `format` is now required
  - python-poetry/poetry#9961

Full Poetry v2.0.0 release notes and change log:
https://python-poetry.org/blog/announcing-poetry-2.0.0
2025-01-06 23:34:49 +01:00
Abhimanyu Yadav 0872da1969
fix(store) : Download agent from store if user is not logged in (#9121)
- resolves - #9120 

### Changes 

- Added a new endpoint to download agent files as JSON, allowing users
to retrieve agent data by store listing version ID and version number.
- Introduced a new `get_agent` function in the database module to fetch
agent details and prepare the graph data for download.
- Enhanced the frontend `AgentInfo` component to include a download
button, which triggers the download of the agent file.
- Integrated loading state and user feedback via toast notifications
during the download process.
- Updated the API client to support the new download functionality.

### Demo video 



https://github.com/user-attachments/assets/6744a753-297f-4ccc-abde-f56ca24ed2d5

### Example JSON

```json
{
  "id": "14378095-4cc5-41ea-975e-bd0bce010bea",
  "version": 1,
  "is_active": true,
  "is_template": false,
  "name": "something",
  "description": "1",
  "nodes": [
    {
      "id": "6914efa0-e4fa-4ce8-802c-d5577cf061b6",
      "block_id": "aeb08fc1-2fc1-4141-bc8e-f758f183a822",
      "input_default": {},
      "metadata": {
        "position": {
          "x": 756,
          "y": 452.5
        }
      },
      "input_links": [],
      "output_links": [],
      "webhook_id": null,
      "graph_id": "14378095-4cc5-41ea-975e-bd0bce010bea",
      "graph_version": 1,
      "webhook": null
    }
  ],
  "links": [],
  "input_schema": {
    "type": "object",
    "properties": {},
    "required": []
  },
  "output_schema": {
    "type": "object",
    "properties": {},
    "required": []
  }
}
```
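A hedged sketch of how such a download route could be wired up (route path, parameter names, and the `get_agent` stub below are illustrative, not the actual implementation):

```python
from fastapi import APIRouter
from fastapi.responses import JSONResponse

router = APIRouter()

async def get_agent(store_listing_version_id: str, version: int) -> dict:
    """Placeholder for the database helper described above."""
    return {"id": store_listing_version_id, "version": version, "nodes": [], "links": []}

@router.get("/store/agents/{store_listing_version_id}/download")
async def download_agent(store_listing_version_id: str, version: int) -> JSONResponse:
    graph = await get_agent(store_listing_version_id, version)
    return JSONResponse(
        content=graph,
        headers={"Content-Disposition": 'attachment; filename="agent.json"'},
    )
```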

---------

Co-authored-by: SwiftyOS <craigswift13@gmail.com>
2025-01-03 17:21:15 +00:00
Reinier van der Leer 1375a0fdbc
feat(platform): Support multiple credentials inputs on blocks (#8932)
- Resolves #8930
- Depends on #8725

### Changes 🏗️

- feat(platform): Support multiple credentials inputs on blocks

Aside from `credentials`, fields matching the name pattern `*_credentials`
are now also supported! (See the sketch below.)

- Update docs with info on multi credentials support
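A hedged sketch of the naming convention (using a bare pydantic-style schema instead of the platform's actual block classes):

```python
from pydantic import BaseModel

class CredentialsMetaInput(BaseModel):  # stand-in for the platform's credentials type
    provider: str
    id: str

class MyBlockInput(BaseModel):
    # The plain `credentials` field keeps working as before...
    credentials: CredentialsMetaInput
    # ...and any field matching `*_credentials` is now also treated as a
    # credentials input, so one block can require several providers.
    github_credentials: CredentialsMetaInput
    notion_credentials: CredentialsMetaInput
```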

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Ask @aarushik93 to test
2025-01-03 15:18:57 +00:00
Bently 84af37a27a
refactor(blocks): Move some GitHub blocks to correct file (#9180)
This moves my recently added blocks ``GithubCreateFileBlock``,
``GithubUpdateFileBlock``, ``GithubCreateRepositoryBlock`` and
``GithubListStargazersBlock`` to the correct file, ``github/repo.py``, as
I placed them in the wrong file originally.
2025-01-03 15:02:53 +00:00
Zamil Majdy 8f1a065976
fix(frontend): Make input layout & padding consistent (#9170)
There are a few hardcoded margins and paddings in the block input layout,
causing the input to sometimes overflow or render inconsistently.


![image](https://github.com/user-attachments/assets/8a9b8e0d-04fd-4660-94d3-5dfe69cbc77d)


### Changes 🏗️

* Make padding consistent between left & right, top & bottom.
* Remove hard-coded margins.
* Match the hardcoded negative margin of the right node handle to that of
the left node handle.
* Make the input box take the full width.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-03 10:59:06 +00:00
dependabot[bot] e7689a1eb7
chore(market/deps-dev): bump the development-dependencies group across 1 directory with 3 updates (#9165)
Bumps the development-dependencies group with 3 updates in the
/autogpt_platform/market directory:
[pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio),
[ruff](https://github.com/astral-sh/ruff) and
[pyright](https://github.com/RobertCraigie/pyright-python).

Updates `pytest-asyncio` from 0.25.0 to 0.25.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pytest-dev/pytest-asyncio/releases">pytest-asyncio's
releases</a>.</em></p>
<blockquote>
<h2>pytest-asyncio 0.25.1</h2>
<ul>
<li>Fixes an issue that caused a broken event loop when a
function-scoped test was executed in between two tests with wider loop
scope <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/issues/950">#950</a></li>
<li>Improves test collection speed in auto mode <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/pull/1020">#1020</a></li>
<li>Corrects the warning that is emitted upon redefining the event_loop
fixture</li>
</ul>
</blockquote>
</details>
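The event-loop fix in the first bullet concerns layouts like the following sketch (marker usage follows pytest-asyncio's documented `loop_scope` argument; test names are made up):

```python
import asyncio

import pytest

@pytest.mark.asyncio(loop_scope="module")
async def test_uses_module_loop_first() -> None:
    await asyncio.sleep(0)

@pytest.mark.asyncio  # default function-scoped loop, sandwiched in between
async def test_uses_function_loop() -> None:
    await asyncio.sleep(0)

@pytest.mark.asyncio(loop_scope="module")
async def test_uses_module_loop_again() -> None:
    # Before 0.25.1 this test could end up with a broken event loop.
    await asyncio.sleep(0)
```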
<details>
<summary>Commits</summary>
<ul>
<li><a
href="623ab74b80"><code>623ab74</code></a>
docs: Prepare release of v0.25.1.</li>
<li><a
href="c236550e73"><code>c236550</code></a>
docs: Fix broken link to the pytest.mark.asyncio reference.</li>
<li><a
href="41c645b3b7"><code>41c645b</code></a>
fix: Correct warning message when redefining the event_loop
fixture.</li>
<li><a
href="2fd10f8243"><code>2fd10f8</code></a>
docs: Clarify deprecation of event_loop fixture.</li>
<li><a
href="a4e82ab25b"><code>a4e82ab</code></a>
docs: Added changelog entry for <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/issues/1020">#1020</a>.</li>
<li><a
href="04f90445e1"><code>04f9044</code></a>
refactor: Replace the &quot;__original_fixture_loop&quot; magic
attribute with the more...</li>
<li><a
href="dafef6c65b"><code>dafef6c</code></a>
refactor: Extracted a function to mark an event loop as created by
pytest-asy...</li>
<li><a
href="0c931b7eab"><code>0c931b7</code></a>
refactor: Extracted function to check if a loop was created by
pytest-asyncio.</li>
<li><a
href="0642dcd27b"><code>0642dcd</code></a>
fix: Fix broken event loop when a function-scoped test is in between two
wide...</li>
<li><a
href="050a5f81c9"><code>050a5f8</code></a>
[pre-commit.ci] pre-commit autoupdate</li>
<li>Additional commits viewable in <a
href="https://github.com/pytest-dev/pytest-asyncio/compare/v0.25.0...v0.25.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `ruff` from 0.8.3 to 0.8.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.8.4</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Extend <code>AIR302</code> with additional
functions and classes (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15015">#15015</a>)</li>
<li>[<code>airflow</code>] Implement <code>moved-to-provider-in-3</code>
for modules that has been moved to Airflow providers
(<code>AIR303</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14764">#14764</a>)</li>
<li>[<code>flake8-use-pathlib</code>] Extend check for invalid path
suffix to include the case <code>&quot;.&quot;</code>
(<code>PTH210</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14902">#14902</a>)</li>
<li>[<code>perflint</code>] Fix panic in <code>PERF401</code> when list
variable is after the <code>for</code> loop (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14971">#14971</a>)</li>
<li>[<code>perflint</code>] Simplify finding the loop target in
<code>PERF401</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15025">#15025</a>)</li>
<li>[<code>pylint</code>] Preserve original value format
(<code>PLR6104</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14978">#14978</a>)</li>
<li>[<code>ruff</code>] Avoid false positives for <code>RUF027</code>
for typing context bindings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15037">#15037</a>)</li>
<li>[<code>ruff</code>] Check for ambiguous pattern passed to
<code>pytest.raises()</code> (<code>RUF043</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14966">#14966</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bandit</code>] Check <code>S105</code> for annotated
assignment (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15059">#15059</a>)</li>
<li>[<code>flake8-pyi</code>] More autofixes for
<code>redundant-none-literal</code> (<code>PYI061</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14872">#14872</a>)</li>
<li>[<code>pydocstyle</code>] Skip leading whitespace for
<code>D403</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14963">#14963</a>)</li>
<li>[<code>ruff</code>] Skip <code>SQLModel</code> base classes for
<code>mutable-class-default</code> (<code>RUF012</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14949">#14949</a>)</li>
</ul>
<h3>Bug</h3>
<ul>
<li>[<code>perflint</code>] Parenthesize walrus expressions in autofix
for <code>manual-list-comprehension</code> (<code>PERF401</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15050">#15050</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Check diagnostic refresh support from client capability which
enables dynamic configuration for various editors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15014">#15014</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/Daverball"><code>@​Daverball</code></a></li>
<li><a
href="https://github.com/DimitriPapadopoulos"><code>@​DimitriPapadopoulos</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a href="https://github.com/Lee-W"><code>@​Lee-W</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a href="https://github.com/TheBits"><code>@​TheBits</code></a></li>
<li><a
href="https://github.com/cake-monotone"><code>@​cake-monotone</code></a></li>
<li><a href="https://github.com/carljm"><code>@​carljm</code></a></li>
<li><a
href="https://github.com/dcreager"><code>@​dcreager</code></a></li>
<li><a
href="https://github.com/dhruvmanila"><code>@​dhruvmanila</code></a></li>
<li><a href="https://github.com/dylwil3"><code>@​dylwil3</code></a></li>
<li><a
href="https://github.com/github-actions"><code>@​github-actions</code></a></li>
<li><a
href="https://github.com/kiran-4444"><code>@​kiran-4444</code></a></li>
<li><a
href="https://github.com/krishnan-chandra"><code>@​krishnan-chandra</code></a></li>
<li><a
href="https://github.com/rchen152"><code>@​rchen152</code></a></li>
<li><a
href="https://github.com/renovate"><code>@​renovate</code></a></li>
<li><a href="https://github.com/sharkdp"><code>@​sharkdp</code></a></li>
<li><a
href="https://github.com/tarasmatsyk"><code>@​tarasmatsyk</code></a></li>
<li><a
href="https://github.com/w0nder1ng"><code>@​w0nder1ng</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
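As a rough illustration of two of the checks mentioned above (hand-written, not from the ruff docs): `match=` in `pytest.raises()` is a regular expression, which is the ambiguity RUF043 looks for, and S105 now also covers hardcoded passwords in annotated assignments.

```python
import re

import pytest

# RUF043: the parentheses below would form a regex group, not literal text,
# so the pattern is escaped to make the intent unambiguous.
def test_rejects_legacy_config() -> None:
    with pytest.raises(ValueError, match=re.escape("legacy config (deprecated)")):
        raise ValueError("legacy config (deprecated)")

# S105 (hardcoded password) now also checks annotated assignments like this:
ADMIN_PASSWORD: str = "hunter2"   # illustration only
```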
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.8.4</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Extend <code>AIR302</code> with additional
functions and classes (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15015">#15015</a>)</li>
<li>[<code>airflow</code>] Implement <code>moved-to-provider-in-3</code>
for modules that has been moved to Airflow providers
(<code>AIR303</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14764">#14764</a>)</li>
<li>[<code>flake8-use-pathlib</code>] Extend check for invalid path
suffix to include the case <code>&quot;.&quot;</code>
(<code>PTH210</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14902">#14902</a>)</li>
<li>[<code>perflint</code>] Fix panic in <code>PERF401</code> when list
variable is after the <code>for</code> loop (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14971">#14971</a>)</li>
<li>[<code>perflint</code>] Simplify finding the loop target in
<code>PERF401</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15025">#15025</a>)</li>
<li>[<code>pylint</code>] Preserve original value format
(<code>PLR6104</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14978">#14978</a>)</li>
<li>[<code>ruff</code>] Avoid false positives for <code>RUF027</code>
for typing context bindings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15037">#15037</a>)</li>
<li>[<code>ruff</code>] Check for ambiguous pattern passed to
<code>pytest.raises()</code> (<code>RUF043</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14966">#14966</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bandit</code>] Check <code>S105</code> for annotated
assignment (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15059">#15059</a>)</li>
<li>[<code>flake8-pyi</code>] More autofixes for
<code>redundant-none-literal</code> (<code>PYI061</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14872">#14872</a>)</li>
<li>[<code>pydocstyle</code>] Skip leading whitespace for
<code>D403</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14963">#14963</a>)</li>
<li>[<code>ruff</code>] Skip <code>SQLModel</code> base classes for
<code>mutable-class-default</code> (<code>RUF012</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14949">#14949</a>)</li>
</ul>
<h3>Bug</h3>
<ul>
<li>[<code>perflint</code>] Parenthesize walrus expressions in autofix
for <code>manual-list-comprehension</code> (<code>PERF401</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15050">#15050</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Check diagnostic refresh support from client capability which
enables dynamic configuration for various editors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15014">#15014</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3bb0dac235"><code>3bb0dac</code></a>
Bump version to 0.8.4 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15064">#15064</a>)</li>
<li><a
href="40cba5dc8a"><code>40cba5d</code></a>
[red-knot] Cleanup various <code>todo_type!()</code> messages (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15063">#15063</a>)</li>
<li><a
href="596d80cc8e"><code>596d80c</code></a>
[<code>perflint</code>] Parenthesize walrus expressions in autofix for
`manual-list-comp...</li>
<li><a
href="d8b9a366c8"><code>d8b9a36</code></a>
Disable actionlint hook by default when running pre-commit locally (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15061">#15061</a>)</li>
<li><a
href="85e71ba91a"><code>85e71ba</code></a>
[<code>flake8-bandit</code>] Check <code>S105</code> for annotated
assignment (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15059">#15059</a>)</li>
<li><a
href="2802cbde29"><code>2802cbd</code></a>
Don't special-case class instances in unary expression inference (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15045">#15045</a>)</li>
<li><a
href="ed2bce6ebb"><code>ed2bce6</code></a>
[red-knot] Report invalid exceptions (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15042">#15042</a>)</li>
<li><a
href="f0012df686"><code>f0012df</code></a>
Fix typos in <code>RUF043.py</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15044">#15044</a>)</li>
<li><a
href="0fc4e8f795"><code>0fc4e8f</code></a>
Introduce <code>InferContext</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14956">#14956</a>)</li>
<li><a
href="ac81c72bf3"><code>ac81c72</code></a>
[<code>ruff</code>] Ambiguous pattern passed to
<code>pytest.raises()</code> (<code>RUF043</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14966">#14966</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.3...0.8.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `pyright` from 1.1.390 to 1.1.391
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3356df1d40"><code>3356df1</code></a>
[pyright updated to 1.1.391] Update Version (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/327">#327</a>)</li>
<li>See full diff in <a
href="https://github.com/RobertCraigie/pyright-python/compare/v1.1.390...v1.1.391">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-03 09:49:36 +00:00
Bently fe8393a82f
feat(blocks): Add github list stargazers block (#9172)
This adds a List Stargazers block; it uses
https://docs.github.com/en/rest/activity/starring?apiVersion=2022-11-28#list-stargazers
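
For reference, a minimal sketch of how a block could page through this endpoint with the `requests` library; the function name, token handling, and pagination loop are illustrative assumptions, not the block's actual code.

```
import requests

def list_stargazers(owner: str, repo: str, token: str) -> list[dict]:
    # Illustrative sketch: page through GET /repos/{owner}/{repo}/stargazers.
    url = f"https://api.github.com/repos/{owner}/{repo}/stargazers"
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28",
    }
    stargazers: list[dict] = []
    page = 1
    while True:
        resp = requests.get(url, headers=headers, params={"per_page": 100, "page": page})
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        stargazers.extend(batch)
        page += 1
    return stargazers
```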


![image](https://github.com/user-attachments/assets/0fe87a97-ebea-40c2-818d-28f6555ae91c)
2025-01-03 09:48:51 +00:00
Reinier van der Leer fa98827fd1
fix(backend): Fix validation of hostname-less URLs (#9171)
Previously, `http://` would be converted to `http://http`, which let it pass
the no-hostname check. It eventually failed validation, but only at hostname
lookup, which times out and therefore takes very long.
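
A minimal sketch of the kind of guard this implies, using only the standard library; the real `validate_url` in the backend does more (scheme checks, blocked-IP checks), so treat this as an illustration of the no-hostname check only.

```
from urllib.parse import urlparse

def validate_url(url: str) -> str:
    # Sketch: reject hostname-less URLs up front instead of letting them
    # fail much later at DNS resolution.
    if "://" not in url:
        url = "http://" + url  # add a scheme only; never substitute a fake hostname
    parsed = urlparse(url)
    if not parsed.hostname:
        raise ValueError(f"URL has no hostname: {url!r}")
    return url
```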

### Changes 🏗️

- Fix URL canonicalization logic
- Merge `_canonicalize_url` into `validate_url`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] CI
2025-01-03 09:48:30 +00:00
dependabot[bot] d7d69f397f
chore(frontend/deps-dev): bump the development-dependencies group across 1 directory with 5 updates (#9150)
Bumps the development-dependencies group with 5 updates in the
/autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
|
[@chromatic-com/storybook](https://github.com/chromaui/addon-visual-tests)
| `3.2.2` | `3.2.3` |
| [@storybook/test-runner](https://github.com/storybookjs/test-runner) |
`0.20.1` | `0.21.0` |
| [concurrently](https://github.com/open-cli-tools/concurrently) |
`9.1.0` | `9.1.1` |
|
[eslint-config-next](https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next)
| `15.1.0` | `15.1.3` |
| [tailwindcss](https://github.com/tailwindlabs/tailwindcss) | `3.4.16`
| `3.4.17` |


Updates `@chromatic-com/storybook` from 3.2.2 to 3.2.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/addon-visual-tests/releases"><code>@​chromatic-com/storybook</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v3.2.3</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Fix reading <code>status</code> of <code>undefined</code> in urql's
<code>didAuthError</code> handler <a
href="https://redirect.github.com/chromaui/addon-visual-tests/pull/349">#349</a>
(<a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>)</li>
<li>Add steps to link for local testing <a
href="https://redirect.github.com/chromaui/addon-visual-tests/pull/347">#347</a>
(<a href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<h4>Authors: 2</h4>
<ul>
<li>Cody Kaup (<a
href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
<li>Gert Hengeveld (<a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/addon-visual-tests/blob/main/CHANGELOG.md"><code>@​chromatic-com/storybook</code>'s
changelog</a>.</em></p>
<blockquote>
<h1>v3.2.3 (Thu Dec 19 2024)</h1>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Fix reading <code>status</code> of <code>undefined</code> in urql's
<code>didAuthError</code> handler <a
href="https://redirect.github.com/chromaui/addon-visual-tests/pull/349">#349</a>
(<a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>)</li>
<li>Add steps to link for local testing <a
href="https://redirect.github.com/chromaui/addon-visual-tests/pull/347">#347</a>
(<a href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<h4>Authors: 2</h4>
<ul>
<li>Cody Kaup (<a
href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
<li>Gert Hengeveld (<a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>)</li>
</ul>
<hr />
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="871413fd8a"><code>871413f</code></a>
Bump version to: 3.2.3 [skip ci]</li>
<li><a
href="a29f901536"><code>a29f901</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="982f9cba43"><code>982f9cb</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/addon-visual-tests/issues/349">#349</a>
from chromaui/fix-auth-error-handler</li>
<li><a
href="35f989a26b"><code>35f989a</code></a>
Request is optional on error object</li>
<li><a
href="ec0a952a47"><code>ec0a952</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/addon-visual-tests/issues/347">#347</a>
from chromaui/cody/cap-2346-write-up-doc-on-how-to-se...</li>
<li><a
href="43ef127485"><code>43ef127</code></a>
Add steps to link for local testing</li>
<li>See full diff in <a
href="https://github.com/chromaui/addon-visual-tests/compare/v3.2.2...v3.2.3">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/test-runner` from 0.20.1 to 0.21.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/test-runner/releases"><code>@​storybook/test-runner</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v0.21.0</h2>
<h4>🚀 Enhancement</h4>
<ul>
<li>Release 0.21.0 <a
href="https://redirect.github.com/storybookjs/test-runner/pull/527">#527</a>
(<a href="https://github.com/kaelig"><code>@​kaelig</code></a> <a
href="https://github.com/guspan-tanadi"><code>@​guspan-tanadi</code></a>
<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Feature: Add --listTests flag from Jest <a
href="https://redirect.github.com/storybookjs/test-runner/pull/521">#521</a>
(<a href="https://github.com/kaelig"><code>@​kaelig</code></a>)</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li>style(README): highlight Markdown Note section <a
href="https://redirect.github.com/storybookjs/test-runner/pull/523">#523</a>
(<a
href="https://github.com/guspan-tanadi"><code>@​guspan-tanadi</code></a>)</li>
<li>Fix: Handle RSC errors <a
href="https://redirect.github.com/storybookjs/test-runner/pull/526">#526</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 3</h4>
<ul>
<li>Guspan Tanadi (<a
href="https://github.com/guspan-tanadi"><code>@​guspan-tanadi</code></a>)</li>
<li>Kaelig Deloumeau-Prigent (<a
href="https://github.com/kaelig"><code>@​kaelig</code></a>)</li>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h2>v0.21.0-next.1</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Fix: Handle RSC errors <a
href="https://redirect.github.com/storybookjs/test-runner/pull/526">#526</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h2>v0.21.0-next.0</h2>
<h4>🚀 Enhancement</h4>
<ul>
<li>Feature: Add --listTests flag from Jest <a
href="https://redirect.github.com/storybookjs/test-runner/pull/521">#521</a>
(<a href="https://github.com/kaelig"><code>@​kaelig</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Kaelig Deloumeau-Prigent (<a
href="https://github.com/kaelig"><code>@​kaelig</code></a>)</li>
</ul>
<h2>v0.20.2-next.0</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Fix postVisit hook issue <a
href="https://redirect.github.com/storybookjs/test-runner/pull/519">#519</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/test-runner/blob/v0.21.0/CHANGELOG.md"><code>@​storybook/test-runner</code>'s
changelog</a>.</em></p>
<blockquote>
<h1>v0.21.0 (Fri Dec 20 2024)</h1>
<h4>🚀 Enhancement</h4>
<ul>
<li>Release 0.21.0 <a
href="https://redirect.github.com/storybookjs/test-runner/pull/527">#527</a>
(<a href="https://github.com/kaelig"><code>@​kaelig</code></a> <a
href="https://github.com/guspan-tanadi"><code>@​guspan-tanadi</code></a>
<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Feature: Add --listTests flag from Jest <a
href="https://redirect.github.com/storybookjs/test-runner/pull/521">#521</a>
(<a href="https://github.com/kaelig"><code>@​kaelig</code></a>)</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li>style(README): highlight Markdown Note section <a
href="https://redirect.github.com/storybookjs/test-runner/pull/523">#523</a>
(<a
href="https://github.com/guspan-tanadi"><code>@​guspan-tanadi</code></a>)</li>
<li>Fix: Handle RSC errors <a
href="https://redirect.github.com/storybookjs/test-runner/pull/526">#526</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 3</h4>
<ul>
<li>Guspan Tanadi (<a
href="https://github.com/guspan-tanadi"><code>@​guspan-tanadi</code></a>)</li>
<li>Kaelig Deloumeau-Prigent (<a
href="https://github.com/kaelig"><code>@​kaelig</code></a>)</li>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<hr />
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="301bdaed74"><code>301bdae</code></a>
Bump version to: 0.21.0 [skip ci]</li>
<li><a
href="e51456ee77"><code>e51456e</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="f09c9258ac"><code>f09c925</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/test-runner/issues/527">#527</a>
from storybookjs/release/v0.21.0</li>
<li><a
href="5cd46e3655"><code>5cd46e3</code></a>
Merge branch 'main' into release/v0.21.0</li>
<li><a
href="30d892498f"><code>30d8924</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/test-runner/issues/523">#523</a>
from guspan-tanadi/notehighlight</li>
<li><a
href="9cb0ec2308"><code>9cb0ec2</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/test-runner/issues/526">#526</a>
from storybookjs/fix-rsc-error-handling</li>
<li><a
href="b21c854545"><code>b21c854</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/test-runner/issues/521">#521</a>
from kaelig/add-listTests</li>
<li><a
href="b7a6bca7ce"><code>b7a6bca</code></a>
handle RSC errors</li>
<li><a
href="36573a00b7"><code>36573a0</code></a>
style(README): highlight Markdown Note section</li>
<li><a
href="740607eb10"><code>740607e</code></a>
Add listTests flag from Jest</li>
<li>See full diff in <a
href="https://github.com/storybookjs/test-runner/compare/v0.20.1...v0.21.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `concurrently` from 9.1.0 to 9.1.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/open-cli-tools/concurrently/releases">concurrently's
releases</a>.</em></p>
<blockquote>
<h2>v9.1.1</h2>
<h2>What's Changed</h2>
<ul>
<li>fix: support Deno's JSON with comments configuration by <a
href="https://github.com/mahtaran"><code>@​mahtaran</code></a> in <a
href="https://redirect.github.com/open-cli-tools/concurrently/pull/523">open-cli-tools/concurrently#523</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/open-cli-tools/concurrently/compare/v9.1.0...v9.1.1">https://github.com/open-cli-tools/concurrently/compare/v9.1.0...v9.1.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6cafc606a3"><code>6cafc60</code></a>
9.1.1</li>
<li><a
href="80fceda02e"><code>80fceda</code></a>
fix: support Deno's JSON with comments (<a
href="https://redirect.github.com/open-cli-tools/concurrently/issues/523">#523</a>)</li>
<li><a
href="8d3f9761bf"><code>8d3f976</code></a>
docs: fix inconsistencies in passthrough args page</li>
<li>See full diff in <a
href="https://github.com/open-cli-tools/concurrently/compare/v9.1.0...v9.1.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `eslint-config-next` from 15.1.0 to 15.1.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases">eslint-config-next's
releases</a>.</em></p>
<blockquote>
<h2>v15.1.3</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>Retry manifest file loading only in dev mode: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/73900">#73900</a></li>
<li>Use shared worker for lint &amp; typecheck steps: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/74154">#74154</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/unstubbable"><code>@​unstubbable</code></a> and
<a href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.2</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>Update React from 7283a213-20241206 to 65e06cb7-20241218: <a
href="https://redirect.github.com/vercel/next.js/pull/74117">vercel/next.js#74117</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.1</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>fix(turbo): sassOptions silenceDeprecations was not overwritten with
user options: <a
href="https://redirect.github.com/vercel/next.js/pull/73937">vercel/next.js#73937</a></li>
<li>refactor collectAppPageSegments: <a
href="https://redirect.github.com/vercel/next.js/pull/73908">vercel/next.js#73908</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/devjiwonchoi"><code>@​devjiwonchoi</code></a>
and <a href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.1-canary.23</h2>
<h3>Misc Changes</h3>
<ul>
<li>docs: remove catch-all for opengraph-image: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/74338">#74338</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/leerob"><code>@​leerob</code></a> for
helping!</p>
<h2>v15.1.1-canary.22</h2>
<h3>Misc Changes</h3>
<ul>
<li>Fix typo in generateViewport docs: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/74288">#74288</a></li>
</ul>
<h3>Credits</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4cbaaa118d"><code>4cbaaa1</code></a>
v15.1.3</li>
<li><a
href="df392a1b97"><code>df392a1</code></a>
v15.1.2</li>
<li><a
href="4384c6834a"><code>4384c68</code></a>
v15.1.1</li>
<li>See full diff in <a
href="https://github.com/vercel/next.js/commits/v15.1.3/packages/eslint-config-next">compare
view</a></li>
</ul>
</details>
<br />

Updates `tailwindcss` from 3.4.16 to 3.4.17
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/tailwindcss/releases">tailwindcss's
releases</a>.</em></p>
<blockquote>
<h2>v3.4.17</h2>
<h3>Fixed</h3>
<ul>
<li>Work around Node v22.12+ issue (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/15421">#15421</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/tailwindcss/blob/v3.4.17/CHANGELOG.md">tailwindcss's
changelog</a>.</em></p>
<blockquote>
<h2>[3.4.17] - 2024-12-17</h2>
<h3>Fixed</h3>
<ul>
<li>Work around Node v22.12+ issue (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/pull/15421">#15421</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4f9f603e12"><code>4f9f603</code></a>
Fix error</li>
<li><a
href="02faa1529e"><code>02faa15</code></a>
v3.4.17</li>
<li><a
href="e268b2aa96"><code>e268b2a</code></a>
Update changelog</li>
<li><a
href="0a836f76bb"><code>0a836f7</code></a>
Work around issue with Node 22 and Jiti (<a
href="https://redirect.github.com/tailwindlabs/tailwindcss/issues/15421">#15421</a>)</li>
<li>See full diff in <a
href="https://github.com/tailwindlabs/tailwindcss/compare/v3.4.16...v3.4.17">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-02 14:46:55 +00:00
Bently 858dc7adc3
feat(blocks): Add github create repo block (#9169)
This adds a new block, GitHub Create Repository Block
(GithubCreateRepositoryBlock), which lets you create a new GitHub repo.

I have used this to create the repos below, so I know it works:

https://github.com/Bentlybro/discord-whisper-transcriber
https://github.com/Bentlybro/PyAGI-Framework
https://github.com/Bentlybro/FlaskNotes


![image](https://github.com/user-attachments/assets/17127839-7dc9-4b8a-bc60-8b55cb02fde9)
2025-01-02 14:46:14 +00:00
Bently 745aae4aec
feat(blocks): Add github create file block (#9144)
This adds two blocks: a GitHub Create File Block (GithubCreateFileBlock)
and a GitHub Update File Block (GithubUpdateFileBlock).

These allow you to create and update files on GitHub. I used them to create
all the files in my repo here:
https://github.com/Bentlybro/AGPT-Testing/commits/main/
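
For context, a hedged sketch of the GitHub contents API call such blocks wrap (PUT /repos/{owner}/{repo}/contents/{path}); the function and parameter names are illustrative, not the blocks' actual implementation.

```
import base64
import requests

def create_or_update_file(owner: str, repo: str, path: str, content: str,
                          message: str, token: str, sha: str | None = None) -> dict:
    # Sketch: create a file (sha=None) or update an existing one (pass the
    # current blob's sha) via the GitHub contents API.
    body = {
        "message": message,
        "content": base64.b64encode(content.encode()).decode(),  # API expects base64
    }
    if sha:
        body["sha"] = sha  # required when updating an existing file
    resp = requests.put(
        f"https://api.github.com/repos/{owner}/{repo}/contents/{path}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        json=body,
    )
    resp.raise_for_status()
    return resp.json()
```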


![image](https://github.com/user-attachments/assets/ba97b30f-fd32-470d-a5ff-90042f0d9b75)

![image](https://github.com/user-attachments/assets/11d0ecca-f597-4b2b-9df4-cd81fe5a3ca9)

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
2025-01-02 09:23:55 +00:00
SwiftyOS 5959c0d303 Revert "remove marketplace"
This reverts commit 480c4773bf.
2025-01-02 10:28:29 +01:00
SwiftyOS 480c4773bf remove marketplace 2025-01-02 10:28:15 +01:00
Zamil Majdy 1ce1918967
fix(platform): Fields with default value are not set to advanced by default (#9128)
https://github.com/Significant-Gravitas/AutoGPT/issues/8739 causes input
fields that are supposed to be advanced fields to end up as mandatory
fields:


![image](https://github.com/user-attachments/assets/1cb41a79-fe85-4012-91b8-861bd5f9a0ca)

*See the retry count field here.

### Changes 🏗️

Set the `advanced` flag on each input field, deriving its default value with
this logic (a sketch follows the list below):
* If the field has a default value, set `advanced` to True.
* Otherwise, set it to False.
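
A tiny sketch of that defaulting rule, assuming a helper along these lines (the function name is hypothetical; the real change lives in the block schema code):

```
def infer_advanced(has_default: bool, explicit_advanced: bool | None = None) -> bool:
    # A field that already has a default can safely sit behind "advanced";
    # a field without one must stay visible so the user fills it in.
    if explicit_advanced is not None:
        return explicit_advanced  # an explicit setting always wins
    return has_default
```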

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
2024-12-31 16:19:23 +00:00
Zamil Majdy 314b04eaba
feat(backend): Make scheduler DB connection pool configurable & prevent connection overflow (#9149)
max_overflow parameter description:
```
    :param max_overflow=10: the number of connections to allow in
        connection pool "overflow", that is connections that can be
        opened above and beyond the pool_size setting, which defaults
        to five. this is only used with :class:`~sqlalchemy.pool.QueuePool`.
```

### Changes 🏗️

* Prevent additional DB connections from being opened beyond the configured
pool size for the scheduler (see the sketch below).
* Make the pool size configurable.
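
A minimal sketch of a configurable pool with no overflow using SQLAlchemy; the environment variable names here are hypothetical, not the platform's actual settings.

```
import os

from sqlalchemy import create_engine

pool_size = int(os.getenv("SCHEDULER_DB_POOL_SIZE", "3"))  # hypothetical setting name

engine = create_engine(
    os.environ["DATABASE_URL"],
    pool_size=pool_size,  # steady-state connections kept open in the pool
    max_overflow=0,       # never open extra connections beyond pool_size
)
```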

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

Co-authored-by: Swifty <craigswift13@gmail.com>
2024-12-31 16:19:15 +00:00
Zamil Majdy 26214e1b2c
fix(backend): Prevent HTTP requests access to internal IPV6 addresses for Agent Blocks (#9157)
Addresses:
https://github.com/Significant-Gravitas/AutoGPT/security/advisories/GHSA-4c8v-hwxc-2356

Currently, IPv6 is not used by default on this system. However, the lack of
a block on HTTP access to internal systems over IPv6 could be a potential
SSRF vector.

### Changes 🏗️

Prevent HTTP request blocks from accessing internal IPv6 addresses (a sketch
of this kind of check is shown below).
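
A minimal sketch of the kind of check this implies, using the standard-library `ipaddress` module, which covers IPv4 and IPv6 uniformly; the actual block-request validation may differ.

```
import ipaddress

def is_blocked_ip(address: str) -> bool:
    # Sketch: refuse addresses that point at internal or special-purpose networks.
    ip = ipaddress.ip_address(address)
    return (
        ip.is_private         # e.g. 10.0.0.0/8, 192.168.0.0/16, fc00::/7
        or ip.is_loopback     # 127.0.0.0/8, ::1
        or ip.is_link_local   # 169.254.0.0/16, fe80::/10
        or ip.is_reserved
        or ip.is_multicast
        or ip.is_unspecified  # 0.0.0.0, ::
    )
```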

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-31 16:18:57 +00:00
Zamil Majdy 10fc7d2114
fix(backend): Remove croniter (#9130)
Croniter is unused, and it will be deprecated soon.

### Changes 🏗️

Remove croniter.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

Co-authored-by: Swifty <craigswift13@gmail.com>
2024-12-31 15:24:29 +00:00
Zamil Majdy ea01c8038b
fix(frontend): Fix broken block UI layout (#9132)
https://github.com/Significant-Gravitas/AutoGPT/pull/9097/files#diff-ef176e50a6a65af5df2182626ea868ce77b76de447c816fb4f80fb4d376c3049R7-R41
introduced styling changes to the block UI layout that broke the layout:


![image](https://github.com/user-attachments/assets/0d3d6e61-1acc-440c-9c7b-8cc473b457ea)

This PR minimally reverts the styling change.

### Changes 🏗️

Minimal CSS revert to bring the block UI layout back to normal.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-31 09:13:47 +01:00
Zamil Majdy a646e60d2f
fix(backend): Added locking status check before releasing to avoid releasing timing out lock (#9135)
Exception:
```
nid:ce829f66-14b0-4bd3-b748-791e46666cb6|-] Failed node execution ce829f66-14b0-4bd3-b748-791e46666cb6: Cannot release an unlocked lock {}\u001b[0m",
Traceback (most recent call last):\n  File \"/app/autogpt_platform/backend/backend/integrations/creds_manager.py\", line 145, in _locked\n    yield\n  File \"/app/autogpt_platform/backend/backend/integrations/creds_manager.py\", line 115, in acquire\n    lock = self._acquire_lock(user_id, credentials_id)",
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^",
  File \"/app/autogpt_platform/backend/backend/integrations/creds_manager.py\", line 139, in _acquire_lock",
    return self._locks.acquire(key)",
           ^^^^^^^^^^^^^^^^^^^^^^^^",
  File \"/app/autogpt_platform/autogpt_libs/autogpt_libs/utils/synchronize.py\", line 44, in acquire",
    lock.acquire()",
  File \"/usr/local/lib/python3.11/site-packages/redis/lock.py\", line 218, in acquire",
    mod_time.sleep(sleep)",
  File \"/app/autogpt_platform/backend/backend/executor/manager.py\", line 471, in <lambda>",
    signal.SIGTERM, lambda _, __: cls.on_node_executor_sigterm()",
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^",
  File \"/app/autogpt_platform/backend/backend/executor/manager.py\", line 498, in on_node_executor_sigterm",
    sys.exit(0)",
SystemExit: 0",
During handling of the above exception, another exception occurred:",
Traceback (most recent call last):\n  File \"/app/autogpt_platform/backend/backend/executor/manager.py\", line 539, in _on_node_execution\n    for execution in execute_node(\n  File \"/app/autogpt_platform/backend/backend/executor/manager.py\", line 175, in execute_node\n    credentials, creds_lock = creds_manager.acquire(user_id, credentials_meta.id)",
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^",
  File \"/app/autogpt_platform/backend/backend/integrations/creds_manager.py\", line 114, in acquire",
    with self._locked(user_id, credentials_id, \"!time_sensitive\"):",
  File \"/usr/local/lib/python3.11/contextlib.py\", line 158, in __exit__",
    self.gen.throw(typ, value, traceback)",
  File \"/app/autogpt_platform/backend/backend/integrations/creds_manager.py\", line 147, in _locked",
    lock.release()",
  File \"/usr/local/lib/python3.11/site-packages/redis/lock.py\", line 254, in release",
    raise LockError(\"Cannot release an unlocked lock\", lock_name=self.name)",
redis.exceptions.LockError: Cannot release an unlocked lock",
```

### Changes 🏗️

```
try:
   lock.acquire()
   ...
finally:
   lock.release()
```

pattern can cause an error when the lock has already been released because
its timeout expired.

The scope of the change is to check the lock status before releasing it (see
the sketch below).
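
A hedged sketch of the guarded-release pattern, assuming redis-py's `Lock.locked()`/`Lock.owned()` helpers; the key name is illustrative and the production code path differs in structure.

```
from redis import Redis
from redis.exceptions import LockError

redis_client = Redis()
lock = redis_client.lock("credentials:<user_id>:<credentials_id>", timeout=60)

lock.acquire()
try:
    ...  # critical section: use the credentials
finally:
    # The lock may have expired (timeout) while we held it; releasing it then
    # raises "Cannot release an unlocked lock", so check first.
    try:
        if lock.locked() and lock.owned():
            lock.release()
    except LockError:
        pass  # lost the race: the lock expired or was taken over after the check
```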


### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-31 08:48:04 +01:00
Krzysztof Czerwinski 15af2f410b
refactor(frontend): Auth pages update (#9124)
There are UX and design issues with current auth pages; `login`,
`signup` and `reset_password` (including change password).

### Changes 🏗️


![auth](https://github.com/user-attachments/assets/56dfbae3-5c12-4324-a29a-846d091d9501)
*Missing `s` on the login's password error is fixed.

Important changes in bold.

#### All auth pages
- **Split `/login` into `/signup`**
- UI Redesign that adheres to Figma designs
- General code cleanup and improvements
- Fix feedback: it's now shown when needed and clear (e.g. "~~String~~
Password must be...")
- All action functions use `Sentry.withServerActionInstrumentation`
- `PasswordInput` "eye button" shows password only when mouse button is
hold and doesn't capture tab

#### Login page
- **Removed agree to terms checkbox** (it's only on signup now)
- Move provider login function to `actions.ts`

#### Signup page
- **Requires typing the password twice**
- Shows waitlist information on *any* database error

#### Reset password page
- **Password update requires typing the password twice**
- **When the request to send the email is processed, the feedback is:
"Password reset email sent if user exists. Please check your email."**
- Email sent feedback is black, error is red
- Move send email and update password functions to `actions.ts`
- Disable button when email is sent

#### Other
- Update zod schema objects and move them to `types/auth`
- Move `components/PasswordInput.tsx` to `/components/auth`
- Make common UI elements separate components in `components/auth`
- Update `yarn.lock` (supabase packages)
- Remove redundant letter in `client.ts`
- Don't log error when user auth is missing in `useSupabase`; user is
simply not logged in

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Form feedback:
    - [x] Login works
    - [x] Signup works
    - [x] Reset email works
    - [x] Change password works
  - [x] Login works
  - [x] Signup works
  - [x] Reset email is sent
  - [x] Reset email logs user in and redirects to `/reset_password`
  - [x] Change password works
  - [x] Logout works
  - [x] All links across auth pages work

Note: OAuth login providers are disabled and so untested.

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-12-30 18:23:02 +00:00
Swifty 763284e3a3
fix(platform): minor fixes (#9147)
### Changes 🏗️

- Redirect to the marketplace.
- Ensure that the store agent uses agent graph data instead of store
listing data.
- Don’t export agent input values.
- URL sanitization: We can’t open an agent if it has a colon in its
name.
- Show all top agents.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2024-12-30 16:04:35 +01:00
Ethan Lee 10865cd736
[Platform] Creator profile description no longer ignores new lines (#9114)
Fixes #9086

### Changes 🏗️

Added styling to the div that encapsulates the description so that white
space is taken into account.

### Checklist 📋

#### Code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:

The test plan was to make changes to the profile bio and check the creator
page to see whether new lines were rendered properly.
  
  Below is what the new change looks like:
  
  
<img width="882" alt="Screenshot 2024-12-20 at 12 21 09 pm"
src="https://github.com/user-attachments/assets/6d396ec7-96f8-4c9c-9d1f-a5bd75c6dc86"
/>

becomes...

<img width="468" alt="Screenshot 2024-12-20 at 12 21 15 pm"
src="https://github.com/user-attachments/assets/9dbe256b-5800-4f17-91c2-4ecffcffbc0b"
/>
2024-12-21 12:03:49 +00:00
Swifty 1663d4273b
fix(store): username not lowered when its updated (#9112)
fix(store): username not lowercased when it's updated, breaking access to any
of that user's pages in the store
2024-12-20 16:43:57 +00:00
Swifty 658493559d
fix(store): Fixing add agent to library (#9098)
Do a deep copy of the store agent so the new agent is under the current
user's ID.

⚠️  Hacky fix!!
2024-12-20 15:04:47 +01:00
Abhimanyu Yadav 6025506cae
feat(store) : add new model and prompt in image generation (#9099)
Update Marketplace Image generation Prompt and Model  

**Changes:**  
- Updated the image generation prompt for Marketplace to better
highlight agent functionality:
  ```
Create a visually engaging app store thumbnail for the AI agent that
highlights what it does in a clear and captivating way:
  - **Name**: {agent.name}  
  - **Description**: {agent.description}  
  Focus on showcasing its core functionality with an appealing design.
  ```  
- Changed the model to `black-forest-labs/flux-1.1-pro` for improved
results.
2024-12-20 15:02:44 +01:00
Aarushi 54f8d3b4dd
blocks(exa): Add more Exa blocks (#9097)
Revamp the Exa search block and add two more for Content and Similarity
search.

### Changes 🏗️

- Updated the Exa search block input names to be snake_case, not camelCase
- Marked non-required fields as Advanced
- Pulled Content settings into helpers for reuse across blocks
- Updated customnode.css to handle long inputs, especially in the case
of the date input

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...
2024-12-20 14:57:08 +01:00
Swifty a8339d0748
fix(store): Sanitize username and Agent Name in URLs (#9096)
[fix(store): Sanitize username and Agent Name in
URLs](28b86d4a1f)

---------

Co-authored-by: abhi1992002 <abhimanyu1992002@gmail.com>
2024-12-20 14:14:24 +01:00
Abhimanyu Yadav 4cc8616c02
feat(store) : Small UI changes on marketplace (#9094)
Adds small UI changes to the marketplace.

Fix
- #9035
- #9034 
- #9033
- #9031 
- #9029
- #9028
- #9027 
- #9026 
- #9025 
- #9021 
- #9020 
- #9004 
- #9003 
- #9002 
- #9001 
- #8999  
- #8998 
- #8997 
- #8996 
- #8995 
- #8994 
- #8993
- #8991   
- #8990 
- #8989

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
2024-12-20 13:01:15 +01:00
Swifty 44722c4b39
fix(store): remove debug logging of requests (#9093)
[fix(store): remove debug logging of
requests](bb13c864f0)

Remove this stuff:

![Screenshot 2024-12-20 at 09 50
26](https://github.com/user-attachments/assets/b178305b-31eb-4571-8762-6ac8f115eb17)
2024-12-20 10:26:31 +01:00
SwiftyOS e33864f5ed fix(store): fmt 2024-12-20 09:42:01 +01:00
Swifty d3e1319eb3
fix(store): Increase the margin below featured section (#9092)
I've increased the margin to the requested 60px.

Before:

<img width="1522" alt="91e02ace-920a-4e69-9a12-2c55d9a63ff0"
src="https://github.com/user-attachments/assets/d1163b04-7e80-4ac9-81d5-98f3a7f8a8b9"
/>

Requested:

![Screenshot 2024-12-20 at 09 33
28](https://github.com/user-attachments/assets/f06a2d63-ea4a-435b-b8dd-c8f90ef4bad4)
2024-12-20 09:41:19 +01:00
Nicholas Tindle ddac69e0f1
feat: swap context menu for dropdown to fix three dots doing nothing (#9091)
<!-- Clearly explain the need for these changes: -->

### Changes 🏗️

swaps context menu for dropdown menu

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-20 09:16:53 +01:00
Aarushi 1fb9c8c37f
fix(store): Display error toast messaging on creator popup (#9078)
When the agent submission was filled out incorrectly, there was no error
pop up. It just did nothing.

### Changes 🏗️

Created an array to track which fields are missing.
If this array is not empty, a toast is displayed listing the missing fields.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
2024-12-19 22:19:17 +00:00
Aarushi 71310a1b49
fix(frontend): Make clickable area bigger (#9080)
The clickable area on the navbar was very small and the icons were not
clickable

### Changes 🏗️

Wrap the icons in `Link` as well.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-19 21:51:49 +00:00
Swifty 8e634d7bc3
feat(store): Generate AI images for store submissions (#9090)
Allow generating AI images for store submissions.
2024-12-19 22:37:33 +01:00
Swifty d028f5bd39
fix(store): "Publish an Agent" flow has a missing default image (#9089)
This was introduced while making sure `initialData` was inferred from the
agent object. It has now been fixed.
2024-12-19 18:22:04 +00:00
Swifty ca91754bc6
fix(store): Make username case insensitive (#9088)
The username was case-sensitive; it is now case-insensitive.
2024-12-19 18:13:28 +00:00
Krzysztof Czerwinski 8ca80e05a9
fix(frontend): Disable agent save button when saving or running (#9077)
Previously, the agent could be saved multiple times while a save or run was
still in progress.

### Changes 🏗️

Disable agent save button when saving or running.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-19 13:42:04 +00:00
Bently 4f15da99f9
feat(frontend) Remove "credentials" from export & import of agents (#9081)
### Changes 🏗️

This is for [Credential ID Exports into Agent JSON #8919
](https://github.com/Significant-Gravitas/AutoGPT/issues/8919)

I have added a new function ``removeCredentials`` into
[``utils.ts``](https://github.com/Significant-Gravitas/AutoGPT/compare/dev...bently/open-2153-credential-id-exports-into-agent-json?expand=1#diff-db26a69e6fb7546dc621634f3c8ee6efa3639e72e02837f753af18b2fdddf7be)
which goes through the JSON during an agent export, looks for any
"credentials" entries, removes them, and then lets the user download the
file.

I have also added the same function to the importing of agents, so old agents
that were exported while still containing credentials can be imported without
breaking or causing issues.

When I say it looks for credentials, I don't mean actual credentials like API
keys themselves; the exported JSON contains the following, which needs
removing:
```
"credentials": {
  "id": "6767232a-3407-4c34-85a3-6887d4969f0c",
  "title": "Anthropic Toran",
  "provider": "anthropic",
  "type": "api_key"
},
```

If there is a better way to go about this let me know!
2024-12-19 13:20:35 +00:00
Swifty 54dddbf488
feat(store): Auto-populate the agent submission form (#9074)
### Changes 🏗️

- Added description to the "my agents" response
- Auto-populate the publish agent info form


https://github.com/user-attachments/assets/68cd5d33-0f67-4875-80e9-5a7115b847e7
2024-12-19 11:23:31 +00:00
Bently 356aee1b72
fix(store): Marketplace - "Integrations" link in Settings is a 404 (#9073)
I copied the original integrations page into
``/store/(users)/integrations``, made some slight tweaks, and updated the URL
path from ``/integrations`` to ``/store/integrations`` in ``Sidebar.tsx``, so
the button to the integrations page works now.

This also replaces
https://github.com/Significant-Gravitas/AutoGPT/pull/9072



https://github.com/user-attachments/assets/e1ff6fd6-e47a-49b6-82d5-e6fc55eb07b5
2024-12-19 11:18:56 +00:00
Swifty ed7c9378eb
fix(store): Marketplace - Navbar should say "Marketplace" rather than "Agent Store" (#9069)
Fixes #9067 

### Changes 🏗️

- Renamed elements from Agent Store to Marketplace
2024-12-19 10:20:02 +00:00
Swifty aaf4ee524d
fix(store): Youtube link not showing video on agent page (#9068)
Fixes #9054 

## Changes

- add the video link to the start of the images array
2024-12-19 11:19:37 +01:00
SwiftyOS 234e4a35c4 fix(store): Profile updating is handled in an insecure and potentially broken way 2024-12-19 10:47:10 +01:00
Swifty 4646de463a
fix(store): Uploading to store selects two agents (#9065)
Fixes #9059 

### Changes 🏗️

- Changed agent selection from keying on agent name to keying on agent
id
2024-12-19 08:51:01 +00:00
Bently b1d869aad2
feat(frontend): Disable theme toggle (#9062)
### Changes 🏗️

This disables the theme toggle for now. I did not remove it in case we plan
to properly add it back in the future.
2024-12-18 21:45:03 +00:00
Nicholas Tindle bb8a37911c
feat: default for is featured (#9061)
<!-- Clearly explain the need for these changes: -->
Defaults `is_featured` to false.
2024-12-18 20:17:04 +00:00
Nicholas Tindle 746f3d4e41
feat(platform): Support manually setting up webhooks (#8750)
- Resolves #8748

The webhooks system as is works really well for full blown enterprise
webhooks managed via a UI. It does not work for more "chill guy" webhook
tools that just send notifications sometimes.

## Changes 🏗️

- feat(blocks): Add Compass transcription trigger block

- feat(backend): Amend webhooks system to support manual-set-up webhooks
   - Make event filter input optional on webhook-triggered nodes
   - Make credentials optional on webhook-triggered nodes
   - Add code path to re-use existing manual webhook on graph update
   - Add `ManualWebhookManagerBase`

- feat(frontend): Add UI to pass webhook URL to user on manual-set-up
webhook blocks

![image](https://github.com/user-attachments/assets/1c35f161-7fe4-4916-8506-5ca9a838f398)

- fix(backend): Strip webhook info from node objects for graph export

- refactor(backend): Rename `backend.integrations.webhooks.base` to
`._base`

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-12-18 19:24:34 +00:00
SwiftyOS 89a9354acb fix(store): isFeatured used instead of is_featured 2024-12-18 17:06:09 +01:00
Swifty aa883d8465
feat(platform): updated schema to allow featuring of specific creators (#9048)
updated schema to allow featuring of specific creators
2024-12-18 13:32:03 +00:00
Bently e8dd0a297e
feat(frontend): Updates to navbar (#9047)
### Changes 🏗️

Updates to navbar and button sizes, added autogpt icon

The navbar now matches the design and resolves [Markeplace - Reduce the
size of the top menu bar, change the font size & the height of the bar
to 64px
#8953](https://github.com/Significant-Gravitas/AutoGPT/issues/8953)


![image](https://github.com/user-attachments/assets/d8b7cfdd-6e57-4f71-bae5-c2b51bfa63f3)

![image](https://github.com/user-attachments/assets/b908a28f-c325-44df-80e4-84f6eca2ddd5)

![image](https://github.com/user-attachments/assets/b4324590-bf27-4fd5-97e2-c7e6047dda15)
2024-12-18 13:28:00 +00:00
Swifty 9d93704264
feat(platform): Add basic library functionality (#9043)
Add functionality to allow users to add agents to their library from the
store page.
2024-12-18 14:01:48 +01:00
Krzysztof Czerwinski 6ec2bacb72
refactor(frontend): Update Supabase and backend API management (#9036)
Currently there are random issues (logouts, auth desync) and inconveniences
with how Supabase and the backend API work.
Resolves:
- https://github.com/Significant-Gravitas/AutoGPT/issues/9006
- https://github.com/Significant-Gravitas/AutoGPT/issues/8912

### Changes 🏗️

This PR streamlines how Supabase and the backend API are used, to fix current
auth errors, remove unnecessary code, and make Supabase and the backend API
easier to use.

- Add `getServerSupabase` for server side that returns `SupabaseClient`.
- Add `Spinner` component that is used for loading animation.
- Remove redundant `useUser`; the user is already fetched in `useSupabase`.
- Replace most Supabase `create*Client` calls with `getServerSupabase` and
`useSupabase`.
- Remove the redundant `AutoGPTServerAPI` class, rename
`BaseAutoGPTServerAPI` to `BackendAPI`, and use it instead.
- Remove the `SupabaseProvider` context; Supabase already caches internally
what it can.
- Move `useSupabase` hook to its own file and update it.

### Helpful table
| Next.js usage | Server | Client |
|---|---|---|
| API | `new BackendAPI();` | `new BackendAPI();`* or `useBackendAPI()` |
| Supabase | `getServerSupabase();` | `useSupabase();` |
| user, user.role | `getServerUser();`** | `useSupabase();` |

\* `BackendAPI` automatically chooses the correct Supabase client, so while
it's recommended to use `useBackendAPI()`, it's OK to use `new
BackendAPI();` in client components and even memoize it: `useMemo(() =>
new BackendAPI(), [])`.

** The reason the user isn't returned from `getServerSupabase` is that
fetching the user is async while creating the Supabase client isn't, so
returning it would force `getServerSupabase` to be async or to return
`{ supabase: SupabaseClient, user: Promise<User> | null }`. For the same
reason, `useSupabase` provides access to `supabase` immediately while
`user` *may* still be loading, so `isUserLoading` is provided as well.
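
A usage sketch of the table above. The names (`BackendAPI`, `useBackendAPI`, `useSupabase`, `getServerSupabase`, `getServerUser`) come from this PR, but the import paths and exact signatures below are assumptions rather than the project's real API surface.

```typescript
// Assumed import paths; only the exported names are taken from this PR.
import BackendAPI from "@/lib/autogpt-server-api";
import { getServerSupabase, getServerUser } from "@/lib/supabase/server";
import { useBackendAPI, useSupabase } from "@/lib/supabase/hooks";

// Server side (Server Component / route handler): the Supabase client is
// available synchronously, while the user has to be awaited separately.
export async function loadServerContext() {
  const supabase = getServerSupabase();
  const user = await getServerUser();
  const api = new BackendAPI(); // BackendAPI picks the correct Supabase client itself
  return { supabase, user, api };
}

// Client side: useSupabase() exposes `supabase` immediately, but `user` may
// still be loading, so check `isUserLoading` before relying on it.
export function useClientContext() {
  const api = useBackendAPI(); // or: useMemo(() => new BackendAPI(), [])
  const { supabase, user, isUserLoading } = useSupabase();
  return { api, supabase, user, isUserLoading };
}
```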

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-18 09:55:23 +00:00
Bently 95bd268de8
feat(frontend): search results updates (#9024)
This PR covers all of these issues below related to the search bar
section

- [Marketplace - search results - change margins between chips and
section title
#8980](https://github.com/Significant-Gravitas/AutoGPT/issues/8980)
- [Marketplace - search results - #8981
](https://github.com/Significant-Gravitas/AutoGPT/issues/8981)
- [Marketplace - search results - search box reduce height to 60px
#8977](https://github.com/Significant-Gravitas/AutoGPT/issues/8977)
- [Marketplace - search results - increase margins between filter chips
and search box
#8978](https://github.com/Significant-Gravitas/AutoGPT/issues/8978)
- [Marketplace - search results - change line height
#8979](https://github.com/Significant-Gravitas/AutoGPT/issues/8979)

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-18 09:25:31 +01:00
Krzysztof Czerwinski 0e10e62bfa
feat(frontend): Reset password page (#8987)
Currently, users have no way to reset their password.

### Changes 🏗️

Add a `reset_password` page that either displays a form to send a
reset-password email or lets a logged-in user change their password. The
login page now shows a clickable "Forgot your password?" link. After
updating their password, the user is logged out and redirected to the
login page.

Note: The link provided in the email just logs the user in and redirects
to the reset-password form, but a password update isn't enforced.

<img width="279" alt="Screenshot 2024-12-14 at 1 28 39 PM"
src="https://github.com/user-attachments/assets/c7ada10c-74e5-4be3-8033-0912eb5b38f2"
/>
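
A minimal sketch of the two steps described above, assuming supabase-js v2 and calling it directly for brevity (the project wraps the client in its own helpers); the redirect path and environment variable names are illustrative.

```typescript
import { createClient } from "@supabase/supabase-js";

// Assumed environment variables; the project configures its client elsewhere.
const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
);

// Step 1: send the reset-password email. The link in the email logs the user
// in and redirects them back to the reset_password page.
export async function sendResetEmail(email: string) {
  const { error } = await supabase.auth.resetPasswordForEmail(email, {
    redirectTo: `${window.location.origin}/reset_password`,
  });
  if (error) throw error;
}

// Step 2: once logged in (via the email link or a normal session), update the
// password, then sign the user out so they land back on the login page.
export async function updatePassword(newPassword: string) {
  const { error } = await supabase.auth.updateUser({ password: newPassword });
  if (error) throw error;
  await supabase.auth.signOut();
}
```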

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Email is sent
- [x] Link in the email logs user in and redirects to reset password
form
  - [x] Reset password form works

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-17 18:30:18 +00:00
SwiftyOS 4d19bcdc5e Merge branch 'dev' of github.com:Significant-Gravitas/AutoGPT into dev 2024-12-17 14:19:53 +01:00
SwiftyOS e27d7a2efb revert upgrade of crypto lib 2024-12-17 14:19:41 +01:00
Reinier van der Leer a386b3ac90
fix(forge): Update browser extension download URL (#9032)
- Resolves #9030
2024-12-17 11:26:39 +00:00
SwiftyOS 41be88f0bf Update dependencies 2024-12-17 11:07:13 +01:00
dependabot[bot] 53eda98737
build(deps-dev): bump the development-dependencies group across 1 directory with 3 updates (#9017)
Bumps the development-dependencies group with 3 updates in the
/autogpt_platform/frontend directory:
[@storybook/test-runner](https://github.com/storybookjs/test-runner),
[eslint-config-next](https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next)
and [msw](https://github.com/mswjs/msw).

Updates `@storybook/test-runner` from 0.19.1 to 0.20.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/test-runner/releases"><code>@​storybook/test-runner</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v0.20.1</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Release 0.20.1 <a
href="https://redirect.github.com/storybookjs/test-runner/pull/520">#520</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Fix postVisit hook issue <a
href="https://redirect.github.com/storybookjs/test-runner/pull/519">#519</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h2>v0.20.0</h2>
<h3>Release Notes</h3>
<h4>Refactor: Align with Storybook 8.2 core package layout</h4>
<p>This is a structural change that shouldn't really affect you. As long
as you have the <code>storybook</code> dependency in your app (which you
should), you're good! This change makes it so that the test-runner
deduplicates Storybook dependencies, and therefore, slims down your
node_modules size.</p>
<h4>Feature: Run postVisit on failures (<a
href="https://redirect.github.com/storybookjs/test-runner/pull/494">#494</a>)</h4>
<p>The test-runner's postVisit hook now runs even if there are failures.
This allows you to, for instance, take snapshots on component failures.
You can check whether the test has failed via the
<code>hasFailure</code> property in the context passed to the hook:</p>
<pre lang="ts"><code>const config: TestRunnerConfig = {
  async postVisit(_page, context) {
    if(context.hasFailure) {
      console.log('problems!')
      // do a snapshot, write a log, or anything you like
    }
  },
}
</code></pre>
<hr />
<h4>🚀 Enhancement</h4>
<ul>
<li>Feature: Run postVisit on failures <a
href="https://redirect.github.com/storybookjs/test-runner/pull/494">#494</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Align with Storybook 8.2 core package layout <a
href="https://redirect.github.com/storybookjs/test-runner/pull/512">#512</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>📝 Documentation</h4>
<ul>
<li>Fix tags docs <a
href="https://redirect.github.com/storybookjs/test-runner/pull/497">#497</a>
(<a href="https://github.com/shilman"><code>@​shilman</code></a> <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 6</h4>
<ul>
<li>Michael Shilman (<a
href="https://github.com/shilman"><code>@​shilman</code></a>)</li>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h2>v0.20.0-next.2</h2>
<h3>Release Notes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/test-runner/blob/v0.20.1/CHANGELOG.md"><code>@​storybook/test-runner</code>'s
changelog</a>.</em></p>
<blockquote>
<h1>v0.20.1 (Mon Dec 02 2024)</h1>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Release 0.20.1 <a
href="https://redirect.github.com/storybookjs/test-runner/pull/520">#520</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Fix postVisit hook issue <a
href="https://redirect.github.com/storybookjs/test-runner/pull/519">#519</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Yann Braga (<a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<hr />
<h1>v0.20.0 (Thu Nov 28 2024)</h1>
<h3>Release Notes</h3>
<h4>Feature: Run postVisit on failures (<a
href="https://redirect.github.com/storybookjs/test-runner/pull/494">#494</a>)</h4>
<p>The test-runner's postVisit hook now runs even if there are failures.
This allows you to, for instance, take snapshots on component failures.
You can check whether the test has failed via the
<code>hasFailure</code> property in the context passed to the hook:</p>
<pre lang="ts"><code>const config: TestRunnerConfig = {
  async postVisit(_page, context) {
    if(context.hasFailure) {
      console.log('problems!')
      // do a snapshot, write a log, or anything you like
    }
  },
}
</code></pre>
<hr />
<h4>🚀 Enhancement</h4>
<ul>
<li>Release 0.20.0 <a
href="https://redirect.github.com/storybookjs/test-runner/pull/518">#518</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a> <a
href="https://github.com/shilman"><code>@​shilman</code></a>)</li>
<li>Feature: Run postVisit on failures <a
href="https://redirect.github.com/storybookjs/test-runner/pull/494">#494</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
<li>Release 0.20.0 <a
href="https://redirect.github.com/storybookjs/test-runner/pull/514">#514</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a> <a
href="mailto:runner@fv-az773-358.an51pne1gm2ejjnmkprpigk40g.dx.internal.cloudapp.net">runner@fv-az773-358.an51pne1gm2ejjnmkprpigk40g.dx.internal.cloudapp.net</a>)</li>
<li>Align with Storybook 8.2 core package layout <a
href="https://redirect.github.com/storybookjs/test-runner/pull/512">#512</a>
(<a href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>📝 Documentation</h4>
<ul>
<li>Fix tags docs <a
href="https://redirect.github.com/storybookjs/test-runner/pull/497">#497</a>
(<a href="https://github.com/shilman"><code>@​shilman</code></a> <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>)</li>
</ul>
<h4>Authors: 6</h4>
<ul>
<li>Michael Shilman (<a
href="https://github.com/shilman"><code>@​shilman</code></a>)</li>
<li>shilman (<a
href="mailto:runner@fv-az1567-4.ivwpl3vsblrubjity54i0equac.phxx.internal.cloudapp.net">runner@fv-az1567-4.ivwpl3vsblrubjity54i0equac.phxx.internal.cloudapp.net</a>)</li>
<li>shilman (<a
href="mailto:runner@fv-az2031-358.rag0t2s20xiu3oejmeweyzhkrf.bx.internal.cloudapp.net">runner@fv-az2031-358.rag0t2s20xiu3oejmeweyzhkrf.bx.internal.cloudapp.net</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="055cca406a"><code>055cca4</code></a>
Bump version to: 0.20.1 [skip ci]</li>
<li><a
href="6efa9a391c"><code>6efa9a3</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="7bc1bcf159"><code>7bc1bcf</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/test-runner/issues/520">#520</a>
from storybookjs/release/v0.20.1</li>
<li><a
href="faeafe7696"><code>faeafe7</code></a>
Merge branch 'main' into release/v0.20.1</li>
<li><a
href="eb1a945c1b"><code>eb1a945</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/test-runner/issues/519">#519</a>
from storybookjs/yann/fix-hooks-issue</li>
<li><a
href="6bff30ef83"><code>6bff30e</code></a>
fix postVisit hook issue</li>
<li><a
href="8dabc7addb"><code>8dabc7a</code></a>
Bump version to: 0.20.0 [skip ci]</li>
<li><a
href="c1571a8c6d"><code>c1571a8</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="dc1e287b7e"><code>dc1e287</code></a>
Merge pull request <a
href="https://redirect.github.com/storybookjs/test-runner/issues/518">#518</a>
from storybookjs/release/v0.20.0</li>
<li><a
href="0129a50faa"><code>0129a50</code></a>
bring back main deps</li>
<li>Additional commits viewable in <a
href="https://github.com/storybookjs/test-runner/compare/v0.19.1...v0.20.1">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~storybook-bot">storybook-bot</a>, a new
releaser for <code>@​storybook/test-runner</code> since your current
version.</p>
</details>
<br />

Updates `eslint-config-next` from 15.0.3 to 15.1.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases">eslint-config-next's
releases</a>.</em></p>
<blockquote>
<h2>v15.1.0</h2>
<h3>Core Changes</h3>
<ul>
<li>fix: decrypt bound args before generating a cache key: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72463">#72463</a></li>
<li>Fix the path to the next/experimental/testing/server export: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72527">#72527</a></li>
<li>Expand <code>server-source-maps</code> scenarios to cover Edge
runtime: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72288">#72288</a></li>
<li>Ensure logged errors in Edge runtime include the stack: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72394">#72394</a></li>
<li>fix: added cache control headers for static app routes: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72521">#72521</a></li>
<li>capture console issues as console errors: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72468">#72468</a></li>
<li>Add expireTag and expirePath APIs: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72485">#72485</a></li>
<li>fix: try/catch access to localStorage within
__NEXT_APP_ISR_INDICATOR useEffect: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72362">#72362</a></li>
<li>Move client build ID to a global variable: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72592">#72592</a></li>
<li>refactor(turbopack): Remove <code>swc_css</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72602">#72602</a></li>
<li>Bypass source map dev middleware for client chunks: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72581">#72581</a></li>
<li>chore: remove <code>rc</code> from URL: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72599">#72599</a></li>
<li>improve <code>no-img-element</code> lint error message: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72410">#72410</a></li>
<li>Combine bound <code>&quot;use cache&quot;</code> closure args into a
single parameter: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72587">#72587</a></li>
<li>[Turbopack] add BackendOptions and allow to disable dependencies,
children and storage: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72426">#72426</a></li>
<li>Omit unused args when calling <code>&quot;use cache&quot;</code>
functions: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72506">#72506</a></li>
<li>Add experimental <code>clientSegmentCache</code> flag: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72626">#72626</a></li>
<li>Add <code>compiler.define</code> option: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/71802">#71802</a></li>
<li>Fix static indicator with dynamicIO: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72631">#72631</a></li>
<li>Allow usage of Node.js prereleases: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72635">#72635</a></li>
<li>improved network url in (dev) cli: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72634">#72634</a></li>
<li>chore: update <code>getting-started/react-essentials</code> path: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72250">#72250</a></li>
<li>Fix static indicator for pure IO case: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72639">#72639</a></li>
<li>Bump the monorepo packages TypeScript to <code>5.6.3</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72625">#72625</a></li>
<li>Bump <code>@capsizecss/metrics</code> to 3.4.0 for Geist Google
Font: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72746">#72746</a></li>
<li>refactor: remove unused asNotFound property: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72585">#72585</a></li>
<li>Remove unused <code>enabled</code> config from server actions
transforms: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72755">#72755</a></li>
<li>Ensure Next.js is ignore-listed when used as external: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72498">#72498</a></li>
<li>Bump <code>eslint-plugin-react</code> to 7.37.0: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72759">#72759</a></li>
<li>upgrade amphtml-validator to 1.0.38: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72645">#72645</a></li>
<li>fix multi-level redirect in server actions: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72770">#72770</a></li>
<li>refactor: rename error boundary not-found to http-error-fallback: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72586">#72586</a></li>
<li>Upgrade React from <code>5c56b873-20241107</code> to
<code>7ac8e612-20241113</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72768">#72768</a></li>
<li>Re-use randomly selected dev server port for automatic restarts: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72771">#72771</a></li>
<li>Emit build error when <code>&quot;use cache&quot;</code> is used
without <code>dynamicIO</code> enabled: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72781">#72781</a></li>
<li>fix: not found bounary prop: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72784">#72784</a></li>
<li>silence sass <code>legacy-js-api</code> warning: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72632">#72632</a></li>
<li>[Segment Prefetch] Move access token to route tree: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72775">#72775</a></li>
<li>Add internal affordances to show ignore-listed stackframes in
terminal: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72763">#72763</a></li>
<li>chore(turbopack): Centralize reqwest TLS feature configs in
turbo-tasks-fetch: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72526">#72526</a></li>
<li>Upgrade React from <code>7ac8e612-20241113</code> to
<code>380f5d67-20241113</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72819">#72819</a></li>
<li>Shorten unsourcemapped absolute locations in terminal stacktraces:
<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72764">#72764</a></li>
<li>codemod: replace <code>revalidate(Tag|Path)</code> to
<code>expire(Tag|Path)</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72826">#72826</a></li>
<li>&quot;Fix&quot;: Lift type check out of loop: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72840">#72840</a></li>
<li>hide stack trace in CanaryOnlyError: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72859">#72859</a></li>
<li>Allow missing CacheNodeSeedData during prefetch: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72857">#72857</a></li>
<li>Add Segment Cache feature check to <code>prefetch</code> API: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/72861">#72861</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="dafcd43fac"><code>dafcd43</code></a>
v15.1.0</li>
<li><a
href="2deb35d487"><code>2deb35d</code></a>
v15.0.4-canary.52</li>
<li><a
href="3970d33e6d"><code>3970d33</code></a>
v15.0.4-canary.51</li>
<li><a
href="c824c183d0"><code>c824c18</code></a>
v15.0.4-canary.50</li>
<li><a
href="657c2cbd72"><code>657c2cb</code></a>
v15.0.4-canary.49</li>
<li><a
href="c2078d0c05"><code>c2078d0</code></a>
v15.0.4-canary.48</li>
<li><a
href="6b9baaace0"><code>6b9baaa</code></a>
v15.0.4-canary.47</li>
<li><a
href="2caf05122a"><code>2caf051</code></a>
v15.0.4-canary.46</li>
<li><a
href="a6da830de8"><code>a6da830</code></a>
v15.0.4-canary.45</li>
<li><a
href="85062ae41e"><code>85062ae</code></a>
v15.0.4-canary.44</li>
<li>Additional commits viewable in <a
href="https://github.com/vercel/next.js/commits/v15.1.0/packages/eslint-config-next">compare
view</a></li>
</ul>
</details>
<br />

Updates `msw` from 2.6.8 to 2.7.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/mswjs/msw/releases">msw's
releases</a>.</em></p>
<blockquote>
<h2>v2.7.0 (2024-12-17)</h2>
<h3>Features</h3>
<ul>
<li>use <code>picocolors</code> instead of <code>chalk</code> (<a
href="https://redirect.github.com/mswjs/msw/issues/2377">#2377</a>)
(85bdd82dfe4cd3d514d7820dad3338b485084fbf) <a
href="https://github.com/Namchee"><code>@​Namchee</code></a> <a
href="https://github.com/kettanaito"><code>@​kettanaito</code></a></li>
</ul>
<h2>v2.6.9 (2024-12-16)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>support <code>SharedArrayBuffer</code> in
<code>HttpResponse.arrayBuffer</code> (<a
href="https://redirect.github.com/mswjs/msw/issues/2389">#2389</a>)
(41f00e1a67e21010ab9c1a46c8e92193b655f24a) <a
href="https://github.com/danilofuchs"><code>@​danilofuchs</code></a> <a
href="https://github.com/kettanaito"><code>@​kettanaito</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="e3234fdce5"><code>e3234fd</code></a>
chore(release): v2.7.0</li>
<li><a
href="85bdd82dfe"><code>85bdd82</code></a>
feat: use <code>picocolors</code> instead of <code>chalk</code> (<a
href="https://redirect.github.com/mswjs/msw/issues/2377">#2377</a>)</li>
<li><a
href="58f2d2cfa5"><code>58f2d2c</code></a>
chore: use v18 in .nvmrc</li>
<li><a
href="a845ea1e2f"><code>a845ea1</code></a>
chore(release): v2.6.9</li>
<li><a
href="417918fce0"><code>417918f</code></a>
chore: pack/build before all e2e tests (<a
href="https://redirect.github.com/mswjs/msw/issues/2395">#2395</a>)</li>
<li><a
href="41f00e1a67"><code>41f00e1</code></a>
fix: support <code>SharedArrayBuffer</code> in
<code>HttpResponse.arrayBuffer</code> (<a
href="https://redirect.github.com/mswjs/msw/issues/2389">#2389</a>)</li>
<li><a
href="a6c419c877"><code>a6c419c</code></a>
chore: split build into standalone jobs (<a
href="https://redirect.github.com/mswjs/msw/issues/2386">#2386</a>)</li>
<li>See full diff in <a
href="https://github.com/mswjs/msw/compare/v2.6.8...v2.7.0">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-17 08:58:06 +00:00
Nicholas Tindle abd245cb2b
test(frontend): additional build page automation tooling (#8951)
The tutorial was a bit harder to fully automate than we expected.
Along the way, though, we made these functions, so let's keep them in for
future use.
<!-- Clearly explain the need for these changes: -->

### Changes 🏗️
- Adds a few more functions for the build automation pages

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] I have tested my changes according to the test plan: Writing tests
2024-12-17 08:55:02 +00:00
dependabot[bot] 9e0c296aef
build(deps): bump uvicorn from 0.32.1 to 0.34.0 in /autogpt_platform/market in the production-dependencies group (#9012)
Bumps the production-dependencies group in /autogpt_platform/market with
1 update: [uvicorn](https://github.com/encode/uvicorn).

Updates `uvicorn` from 0.32.1 to 0.34.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/encode/uvicorn/releases">uvicorn's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.34.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Add <code>content-length</code> to 500 response in wsproto by <a
href="https://github.com/Kludex"><code>@​Kludex</code></a> in <a
href="https://redirect.github.com/encode/uvicorn/pull/2542">encode/uvicorn#2542</a></li>
<li>Drop Python 3.8 by <a
href="https://github.com/Kludex"><code>@​Kludex</code></a> in <a
href="https://redirect.github.com/encode/uvicorn/pull/2543">encode/uvicorn#2543</a></li>
</ul>
<hr />
<p><strong>Full Changelog</strong>: <a
href="https://github.com/encode/uvicorn/compare/0.33.0...0.34.0">https://github.com/encode/uvicorn/compare/0.33.0...0.34.0</a></p>
<h2>Version 0.33.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Remove WatchGod by <a
href="https://github.com/Kludex"><code>@​Kludex</code></a> in <a
href="https://redirect.github.com/encode/uvicorn/pull/2536">encode/uvicorn#2536</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/bwells"><code>@​bwells</code></a> made
their first contribution in <a
href="https://redirect.github.com/encode/uvicorn/pull/2491">encode/uvicorn#2491</a></li>
<li><a href="https://github.com/tback"><code>@​tback</code></a> made
their first contribution in <a
href="https://redirect.github.com/encode/uvicorn/pull/2528">encode/uvicorn#2528</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/encode/uvicorn/compare/0.32.1...0.33.0">https://github.com/encode/uvicorn/compare/0.32.1...0.33.0</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/encode/uvicorn/blob/master/CHANGELOG.md">uvicorn's
changelog</a>.</em></p>
<blockquote>
<h2>0.34.0 (2024-12-15)</h2>
<h3>Added</h3>
<ul>
<li>Add <code>content-length</code> to 500 response in
<code>wsproto</code> implementation (<a
href="https://redirect.github.com/encode/uvicorn/issues/2542">#2542</a>)</li>
</ul>
<h3>Removed</h3>
<ul>
<li>Drop support for Python 3.8 (<a
href="https://redirect.github.com/encode/uvicorn/issues/2543">#2543</a>)</li>
</ul>
<h2>0.33.0 (2024-12-14)</h2>
<h3>Removed</h3>
<ul>
<li>Remove <code>WatchGod</code> support for <code>--reload</code> (<a
href="https://redirect.github.com/encode/uvicorn/issues/2536">#2536</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7983c1ae9c"><code>7983c1a</code></a>
Version 0.34.0 (<a
href="https://redirect.github.com/encode/uvicorn/issues/2546">#2546</a>)</li>
<li><a
href="75d4402f32"><code>75d4402</code></a>
Add alls-green job (<a
href="https://redirect.github.com/encode/uvicorn/issues/2544">#2544</a>)</li>
<li><a
href="4156ccb4c9"><code>4156ccb</code></a>
Drop Python 3.8 (<a
href="https://redirect.github.com/encode/uvicorn/issues/2543">#2543</a>)</li>
<li><a
href="3575cbaa4e"><code>3575cba</code></a>
Add <code>content-length</code> to 500 response in wsproto (<a
href="https://redirect.github.com/encode/uvicorn/issues/2542">#2542</a>)</li>
<li><a
href="a500513085"><code>a500513</code></a>
Version 0.33.0 (<a
href="https://redirect.github.com/encode/uvicorn/issues/2539">#2539</a>)</li>
<li><a
href="038f8ef3fe"><code>038f8ef</code></a>
Bump the python-packages group across 1 directory with 9 updates (<a
href="https://redirect.github.com/encode/uvicorn/issues/2538">#2538</a>)</li>
<li><a
href="3aa1d010d6"><code>3aa1d01</code></a>
Remove WatchGod (<a
href="https://redirect.github.com/encode/uvicorn/issues/2536">#2536</a>)</li>
<li><a
href="a3cc36016e"><code>a3cc360</code></a>
docs: add note about server behavior on exceptions (<a
href="https://redirect.github.com/encode/uvicorn/issues/2535">#2535</a>)</li>
<li><a
href="6725ebb1ee"><code>6725ebb</code></a>
docs: add more mkdocs-material features (<a
href="https://redirect.github.com/encode/uvicorn/issues/2534">#2534</a>)</li>
<li><a
href="bfa754e21e"><code>bfa754e</code></a><code>encode/uvicorn#2527</code><a
href="https://redirect.github.com/encode/uvicorn/issues/2528">#2528</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/encode/uvicorn/compare/0.32.1...0.34.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=uvicorn&package-manager=pip&previous-version=0.32.1&new-version=0.34.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-17 08:50:04 +00:00
dependabot[bot] 59f52fb656
build(deps-dev): bump ruff from 0.8.2 to 0.8.3 in /autogpt_platform/autogpt_libs in the development-dependencies group (#9008)
Bumps the development-dependencies group in
/autogpt_platform/autogpt_libs with 1 update:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.8.2 to 0.8.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.8.3</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>Fix fstring formatting removing overlong implicit concatenated
string in expression part (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14811">#14811</a>)</li>
<li>[<code>airflow</code>] Add fix to remove deprecated keyword
arguments (<code>AIR302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14887">#14887</a>)</li>
<li>[<code>airflow</code>]: Extend rule to include deprecated names for
Airflow 3.0 (<code>AIR302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14765">#14765</a>
and <a
href="https://redirect.github.com/astral-sh/ruff/pull/14804">#14804</a>)</li>
<li>[<code>flake8-bugbear</code>] Improve error messages for
<code>except*</code> (<code>B025</code>, <code>B029</code>,
<code>B030</code>, <code>B904</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14815">#14815</a>)</li>
<li>[<code>flake8-bugbear</code>] <code>itertools.batched()</code>
without explicit <code>strict</code> (<code>B911</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14408">#14408</a>)</li>
<li>[<code>flake8-use-pathlib</code>] Dotless suffix passed to
<code>Path.with_suffix()</code> (<code>PTH210</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14779">#14779</a>)</li>
<li>[<code>pylint</code>] Include parentheses and multiple comparators
in check for <code>boolean-chained-comparison</code>
(<code>PLR1716</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14781">#14781</a>)</li>
<li>[<code>ruff</code>] Do not simplify <code>round()</code> calls
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14832">#14832</a>)</li>
<li>[<code>ruff</code>] Don't emit <code>used-dummy-variable</code> on
function parameters (<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14818">#14818</a>)</li>
<li>[<code>ruff</code>] Implement <code>if-key-in-dict-del</code>
(<code>RUF051</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14553">#14553</a>)</li>
<li>[<code>ruff</code>] Mark autofix for <code>RUF052</code> as always
unsafe (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14824">#14824</a>)</li>
<li>[<code>ruff</code>] Teach autofix for
<code>used-dummy-variable</code> about TypeVars etc.
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14819">#14819</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bugbear</code>] Offer unsafe autofix for
<code>no-explicit-stacklevel</code> (<code>B028</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14829">#14829</a>)</li>
<li>[<code>flake8-pyi</code>] Skip all type definitions in
<code>string-or-bytes-too-long</code> (<code>PYI053</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14797">#14797</a>)</li>
<li>[<code>pyupgrade</code>] Do not report when a UTF-8 comment is
followed by a non-UTF-8 one (<code>UP009</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14728">#14728</a>)</li>
<li>[<code>pyupgrade</code>] Mark fixes for
<code>convert-typed-dict-functional-to-class</code> and
<code>convert-named-tuple-functional-to-class</code> as unsafe if they
will remove comments (<code>UP013</code>, <code>UP014</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14842">#14842</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Raise syntax error for mixing <code>except</code> and
<code>except*</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14895">#14895</a>)</li>
<li>[<code>flake8-bugbear</code>] Fix <code>B028</code> to allow
<code>stacklevel</code> to be explicitly assigned as a positional
argument (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14868">#14868</a>)</li>
<li>[<code>flake8-bugbear</code>] Skip <code>B028</code> if
<code>warnings.warn</code> is called with <code>*args</code> or
<code>**kwargs</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14870">#14870</a>)</li>
<li>[<code>flake8-comprehensions</code>] Skip iterables with named
expressions in <code>unnecessary-map</code> (<code>C417</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14827">#14827</a>)</li>
<li>[<code>flake8-pyi</code>] Also remove <code>self</code> and
<code>cls</code>'s annotation (<code>PYI034</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14801">#14801</a>)</li>
<li>[<code>flake8-pytest-style</code>] Fix
<code>pytest-parametrize-names-wrong-type</code> (<code>PT006</code>) to
edit both <code>argnames</code> and <code>argvalues</code> if both of
them are single-element tuples/lists (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14699">#14699</a>)</li>
<li>[<code>perflint</code>] Improve autofix for <code>PERF401</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14369">#14369</a>)</li>
<li>[<code>pylint</code>] Fix <code>PLW1508</code> false positive for
default string created via a mult operation (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14841">#14841</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/BurntSushi"><code>@​BurntSushi</code></a></li>
<li><a
href="https://github.com/DimitriPapadopoulos"><code>@​DimitriPapadopoulos</code></a></li>
<li><a
href="https://github.com/Glyphack"><code>@​Glyphack</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a href="https://github.com/Lee-W"><code>@​Lee-W</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/UnknownPlatypus"><code>@​UnknownPlatypus</code></a></li>
<li><a href="https://github.com/carljm"><code>@​carljm</code></a></li>
<li><a href="https://github.com/cclauss"><code>@​cclauss</code></a></li>
<li><a
href="https://github.com/dcreager"><code>@​dcreager</code></a></li>
<li><a
href="https://github.com/dhruvmanila"><code>@​dhruvmanila</code></a></li>
<li><a href="https://github.com/dylwil3"><code>@​dylwil3</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.8.3</h2>
<h3>Preview features</h3>
<ul>
<li>Fix fstring formatting removing overlong implicit concatenated
string in expression part (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14811">#14811</a>)</li>
<li>[<code>airflow</code>] Add fix to remove deprecated keyword
arguments (<code>AIR302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14887">#14887</a>)</li>
<li>[<code>airflow</code>]: Extend rule to include deprecated names for
Airflow 3.0 (<code>AIR302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14765">#14765</a>
and <a
href="https://redirect.github.com/astral-sh/ruff/pull/14804">#14804</a>)</li>
<li>[<code>flake8-bugbear</code>] Improve error messages for
<code>except*</code> (<code>B025</code>, <code>B029</code>,
<code>B030</code>, <code>B904</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14815">#14815</a>)</li>
<li>[<code>flake8-bugbear</code>] <code>itertools.batched()</code>
without explicit <code>strict</code> (<code>B911</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14408">#14408</a>)</li>
<li>[<code>flake8-use-pathlib</code>] Dotless suffix passed to
<code>Path.with_suffix()</code> (<code>PTH210</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14779">#14779</a>)</li>
<li>[<code>pylint</code>] Include parentheses and multiple comparators
in check for <code>boolean-chained-comparison</code>
(<code>PLR1716</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14781">#14781</a>)</li>
<li>[<code>ruff</code>] Do not simplify <code>round()</code> calls
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14832">#14832</a>)</li>
<li>[<code>ruff</code>] Don't emit <code>used-dummy-variable</code> on
function parameters (<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14818">#14818</a>)</li>
<li>[<code>ruff</code>] Implement <code>if-key-in-dict-del</code>
(<code>RUF051</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14553">#14553</a>)</li>
<li>[<code>ruff</code>] Mark autofix for <code>RUF052</code> as always
unsafe (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14824">#14824</a>)</li>
<li>[<code>ruff</code>] Teach autofix for
<code>used-dummy-variable</code> about TypeVars etc.
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14819">#14819</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bugbear</code>] Offer unsafe autofix for
<code>no-explicit-stacklevel</code> (<code>B028</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14829">#14829</a>)</li>
<li>[<code>flake8-pyi</code>] Skip all type definitions in
<code>string-or-bytes-too-long</code> (<code>PYI053</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14797">#14797</a>)</li>
<li>[<code>pyupgrade</code>] Do not report when a UTF-8 comment is
followed by a non-UTF-8 one (<code>UP009</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14728">#14728</a>)</li>
<li>[<code>pyupgrade</code>] Mark fixes for
<code>convert-typed-dict-functional-to-class</code> and
<code>convert-named-tuple-functional-to-class</code> as unsafe if they
will remove comments (<code>UP013</code>, <code>UP014</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14842">#14842</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Raise syntax error for mixing <code>except</code> and
<code>except*</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14895">#14895</a>)</li>
<li>[<code>flake8-bugbear</code>] Fix <code>B028</code> to allow
<code>stacklevel</code> to be explicitly assigned as a positional
argument (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14868">#14868</a>)</li>
<li>[<code>flake8-bugbear</code>] Skip <code>B028</code> if
<code>warnings.warn</code> is called with <code>*args</code> or
<code>**kwargs</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14870">#14870</a>)</li>
<li>[<code>flake8-comprehensions</code>] Skip iterables with named
expressions in <code>unnecessary-map</code> (<code>C417</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14827">#14827</a>)</li>
<li>[<code>flake8-pyi</code>] Also remove <code>self</code> and
<code>cls</code>'s annotation (<code>PYI034</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14801">#14801</a>)</li>
<li>[<code>flake8-pytest-style</code>] Fix
<code>pytest-parametrize-names-wrong-type</code> (<code>PT006</code>) to
edit both <code>argnames</code> and <code>argvalues</code> if both of
them are single-element tuples/lists (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14699">#14699</a>)</li>
<li>[<code>perflint</code>] Improve autofix for <code>PERF401</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14369">#14369</a>)</li>
<li>[<code>pylint</code>] Fix <code>PLW1508</code> false positive for
default string created via a mult operation (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14841">#14841</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="53f2d72e02"><code>53f2d72</code></a>
Revert certain double quotes from workflow shell script (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14939">#14939</a>)</li>
<li><a
href="3629cbf35a"><code>3629cbf</code></a>
Use double quotes consistently for shell scripts (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14938">#14938</a>)</li>
<li><a
href="37f433814c"><code>37f4338</code></a>
Bump version to 0.8.3 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14937">#14937</a>)</li>
<li><a
href="45b565cbb5"><code>45b565c</code></a>
[red-knot] <code>Any</code> cannot be parameterized (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14933">#14933</a>)</li>
<li><a
href="82faa9bb62"><code>82faa9b</code></a>
Add tests demonstrating f-strings with debug expressions in replacements
that...</li>
<li><a
href="2eac00c60f"><code>2eac00c</code></a>
[<code>perflint</code>] fix invalid hoist in <code>perf401</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14369">#14369</a>)</li>
<li><a
href="033ecf5a4b"><code>033ecf5</code></a>
Also have zizmor check for low-severity security issues (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14893">#14893</a>)</li>
<li><a
href="5509a3d7ae"><code>5509a3d</code></a>
Add LSP settings example for Zed editor (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14894">#14894</a>)</li>
<li><a
href="e4885a2fb2"><code>e4885a2</code></a>
[red-knot] Understand <code>typing.Tuple</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14927">#14927</a>)</li>
<li><a
href="a7e5e42b88"><code>a7e5e42</code></a>
[red-knot] Make <code>attributes.md</code> test future-proof (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14923">#14923</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.2...0.8.3">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.8.2&new-version=0.8.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-17 08:47:56 +00:00
dependabot[bot] 9f9097c62f
chore(backend/deps): Update `cryptography` from 43.0.3 to 44.0.0 (#8870)
Bumps [cryptography](https://github.com/pyca/cryptography) from 43.0.3
to 44.0.0.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst">cryptography's
changelog</a>.</em></p>
<blockquote>
<p>44.0.0 - 2024-11-27</p>
<pre><code>
* **BACKWARDS INCOMPATIBLE:** Dropped support for LibreSSL &lt; 3.9.
* Deprecated Python 3.7 support. Python 3.7 is no longer supported by
the
  Python core team. Support for Python 3.7 will be removed in a future
  ``cryptography`` release.
* Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL
3.4.0.
* macOS wheels are now built against the macOS 10.13 SDK. Users on older
  versions of macOS should upgrade, or they will need to build
  ``cryptography`` themselves.
* Enforce the :rfc:`5280` requirement that extended key usage extensions
must
  not be empty.
* Added support for timestamp extraction to the
  :class:`~cryptography.fernet.MultiFernet` class.
* Relax the Authority Key Identifier requirements on root CA
certificates
  during X.509 verification to allow fields permitted by :rfc:`5280` but
  forbidden by the CA/Browser BRs.
* Added support for
:class:`~cryptography.hazmat.primitives.kdf.argon2.Argon2id`
  when using OpenSSL 3.2.0+.
* Added support for the :class:`~cryptography.x509.Admissions`
certificate extension.
* Added basic support for PKCS7 decryption (including S/MIME 3.2) via

:func:`~cryptography.hazmat.primitives.serialization.pkcs7.pkcs7_decrypt_der`,

:func:`~cryptography.hazmat.primitives.serialization.pkcs7.pkcs7_decrypt_pem`,
and

:func:`~cryptography.hazmat.primitives.serialization.pkcs7.pkcs7_decrypt_smime`.
<p>.. _v43-0-3:<br />
</code></pre></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f299a48153"><code>f299a48</code></a>
remove deprecated call (<a
href="https://redirect.github.com/pyca/cryptography/issues/12052">#12052</a>)</li>
<li><a
href="439eb0594a"><code>439eb05</code></a>
Bump version for 44.0.0 (<a
href="https://redirect.github.com/pyca/cryptography/issues/12051">#12051</a>)</li>
<li><a
href="2c5ad4d8dc"><code>2c5ad4d</code></a>
chore(deps): bump maturin from 1.7.4 to 1.7.5 in /.github/requirements
(<a
href="https://redirect.github.com/pyca/cryptography/issues/12050">#12050</a>)</li>
<li><a
href="d23968addd"><code>d23968a</code></a>
chore(deps): bump libc from 0.2.165 to 0.2.166 (<a
href="https://redirect.github.com/pyca/cryptography/issues/12049">#12049</a>)</li>
<li><a
href="133c0e02ed"><code>133c0e0</code></a>
Bump x509-limbo and/or wycheproof in CI (<a
href="https://redirect.github.com/pyca/cryptography/issues/12047">#12047</a>)</li>
<li><a
href="f2259d7aa0"><code>f2259d7</code></a>
Bump BoringSSL and/or OpenSSL in CI (<a
href="https://redirect.github.com/pyca/cryptography/issues/12046">#12046</a>)</li>
<li><a
href="e201c870b8"><code>e201c87</code></a>
fixed metadata in changelog (<a
href="https://redirect.github.com/pyca/cryptography/issues/12044">#12044</a>)</li>
<li><a
href="c6104cc366"><code>c6104cc</code></a>
Prohibit Python 3.9.0, 3.9.1 -- they have a bug that causes errors (<a
href="https://redirect.github.com/pyca/cryptography/issues/12045">#12045</a>)</li>
<li><a
href="d6cac753c2"><code>d6cac75</code></a>
Add support for decrypting S/MIME messages (<a
href="https://redirect.github.com/pyca/cryptography/issues/11555">#11555</a>)</li>
<li><a
href="b8e5bfd4d7"><code>b8e5bfd</code></a>
chore(deps): bump libc from 0.2.164 to 0.2.165 (<a
href="https://redirect.github.com/pyca/cryptography/issues/12042">#12042</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pyca/cryptography/compare/43.0.3...44.0.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=cryptography&package-manager=pip&previous-version=43.0.3&new-version=44.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-17 08:46:49 +00:00
Abhimanyu Yadav cd339b0ffc
feat(frontend) : Add optional input support for object, array, multi-select, and select as well. (#8982)
- Resolve #8976 

> Once you have checked whether this is working, I will remove the
optional field block.

### Changes 
- Updated `NodeGenericInputField` to handle additional input types:
  - Added support for optional `array` and `object` types.
- Enhanced schema definitions for the optional `string` type to include
enumerations.

### Testing 🔍
- Verified that the new input types function correctly within the
frontend component.

<img width="517" alt="Screenshot 2024-12-13 at 7 08 22 PM"
src="https://github.com/user-attachments/assets/1e4b7c58-2ddc-4082-8a9e-2e11b91495e2"
/>

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-17 03:52:26 +00:00
Abhimanyu Yadav 569222e9cd
feat(blocks): Add `depends_on` support for input fields (#8852)
- Resolves part of #8731 

### Changes
- Added `depends_on` parameter to SchemaField in `model.py` to specify
field dependencies.
- Updated `useAgentGraph` hook to validate input fields based on their
dependencies, ensuring required fields are set when dependent fields are
filled.
- Modified `BlockIOSubSchemaMeta` to include `depends_on` as an optional
property.
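
As a minimal, self-contained sketch of the behaviour described above (the
field names and the dependency mapping below are invented for illustration;
the real logic lives in `SchemaField` and the `useAgentGraph` hook):

```python
# Sketch of the `depends_on` idea (not the actual AutoGPT API): a dependent
# field is only treated as required once all fields it depends on are filled.

def missing_dependent_fields(
    values: dict[str, str], dependencies: dict[str, list[str]]
) -> list[str]:
    """Return fields that must still be set because their dependencies are filled."""
    missing = []
    for field, deps in dependencies.items():
        deps_filled = deps and all(values.get(dep) for dep in deps)
        if deps_filled and not values.get(field):
            missing.append(field)
    return missing

# Hypothetical example: `api_key` only becomes required once `provider` is chosen.
deps = {"api_key": ["provider"]}
print(missing_dependent_fields({"provider": "openai"}, deps))  # ['api_key']
print(missing_dependent_fields({"provider": ""}, deps))        # []
```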



https://github.com/user-attachments/assets/64fd47b3-34dc-48fa-ad90-1c9c5cd4c4a3

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-17 03:20:19 +00:00
Abhimanyu Yadav 2fe6eb1df1
feat(blocks): Add support for mutually exclusive input fields (#8856)
- resolves part of #8731 

### Changes
- Introduced `mutually_exclusive` parameter in `SchemaField` to manage
input exclusivity.
- Implemented logic in `NodeGenericInputField` to disable inputs based
on mutual exclusivity.
- Updated related components to support the new `disabled` state for
inputs.
- Enhanced `BlockIOSubSchemaMeta` to include `mutually_exclusive`
property.
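
A rough, self-contained sketch of the exclusivity rule (the group name and
fields below are illustrative, not the real block inputs): once one field in
a group is filled, the other fields in that group are disabled.

```python
# Sketch of the mutual-exclusivity behaviour described above (illustrative only).
from collections import defaultdict

def disabled_inputs(values: dict[str, str], groups: dict[str, str]) -> set[str]:
    """Given filled values and a field -> exclusivity-group mapping,
    return the fields that should be rendered as disabled."""
    filled_by_group: dict[str, set[str]] = defaultdict(set)
    for field, group in groups.items():
        if values.get(field):
            filled_by_group[group].add(field)
    return {
        field
        for field, group in groups.items()
        if filled_by_group[group] and field not in filled_by_group[group]
    }

# Hypothetical fields: `query` and `url` belong to the same exclusivity group.
groups = {"query": "source", "url": "source"}
print(disabled_inputs({"query": "cats"}, groups))  # {'url'}
print(disabled_inputs({}, groups))                 # set()
```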

> Currently, I’m disabling the input from the same group (I haven’t
added any frontend validation to prevent users from bypassing it).


https://github.com/user-attachments/assets/71fb9fe4-943b-4724-8acb-6aed2232ed6b

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-16 23:30:21 +00:00
Nicholas Tindle f588b69484
fix: merge issues from store -> dev (#9016)
<!-- Clearly explain the need for these changes: -->

Monitor page is broken for me
### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

- Updates monitor page to what was in dev before store pr went in
- Updates graph getting endpoint to handle invalid graphs a bit more
graceful

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test to make sure I can start and view my monitor page
2024-12-16 16:46:53 -06:00
Swifty be6d8cbd18
fix(platform): Restored monitor page and monitor spec code. (#8992)
## Changes 🏗️
	
- Restored monitor page and monitor spec functionality.
- Disabled failing tests to allow for smoother CI/CD processes.
2024-12-16 18:11:23 +00:00
Swifty 2de5e3dd83
feat(platform): Agent Store V2 (#8874)
# 🌎 Overview

AutoGPT Store Version 2 expands on the Pre-Store by enhancing agent
discovery, providing richer content presentation, and introducing new
user engagement features. The focus is on creating a visually appealing
and interactive marketplace that allows users to explore and evaluate
agents through images, videos, and detailed descriptions.

### Vision

To create a visually compelling and interactive open-source marketplace
for autonomous AI agents, where users can easily discover, evaluate, and
interact with agents through media-rich listings, ratings, and version
history.

### Objectives

- 📊 Incorporate visuals (icons, images, videos) into agent listings.
- Introduce a rating system and agent run count.
- 🔄 Provide version history and update logs from creators.
- 🔍 Improve user experience with advanced search and filtering features.

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Bently <tomnoon9@gmail.com>
Co-authored-by: Aarushi <aarushik93@gmail.com>
2024-12-13 16:35:02 +00:00
Ace 94a312a279
Ollama - Remote hosts (#8234)
### Background

Currently, AutoGPT only supports ollama servers running locally. Often
this is not the case, as the ollama server could be running on a
better-suited instance, such as a Jetson board. This PR adds an "ollama
host" input to all LLM blocks, allowing users to select the ollama host
for each LLM block.
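
A rough sketch of what selecting a remote host looks like with the `ollama`
Python client (the address and model name below are placeholders, not taken
from this PR, and the block wiring is omitted):

```python
# Sketch: point the ollama client at a remote host instead of the local default.
from ollama import Client

# Example address only; the default remains the local server if no host is given.
client = Client(host="http://192.168.1.50:11434")
response = client.chat(
    model="llama3.2",  # placeholder model name
    messages=[{"role": "user", "content": "Hello from an AutoGPT block"}],
)
print(response["message"]["content"])  # response shape may vary by client version
```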

### Changes 🏗️

- Changes contained within blocks/llm.py:
    - Adding ollama host input to all LLM blocks
- Fixed incorrect parsing of prompt when passing to ollama in the
StructuredResponse block
    - Used ollama.Client instances to accomplish this.


### Testing 🔍

Tested all LLM blocks with Ollama remote hosts as well as with the
default localhost value.


### Related issues
https://github.com/Significant-Gravitas/AutoGPT/issues/8225

---------

Co-authored-by: Fried-Squid <Fried-Squid>
Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Nicholas Tindle <nicktindle@outlook.com>
2024-12-13 00:02:49 +00:00
Toran Bruce Richards de3c096e23
feat(blocks): Add CreateDictionaryBlock and CreateListBlock (#8903)
Though this is technically possible with the AddToDictionary and
AddToList Blocks, that approach alone feels like a hidden work-around
rather than an intuitive feature, and I'm happy with the duplication in
the name of better experience for our users here.

### Changes 🏗️
- Added `CreateDictionaryBlock` class that creates a dictionary from the
provided key-value pairs.
- Added `CreateListBlock` class that creates a list from the provided
values.


![dictionary](https://github.com/user-attachments/assets/51250715-686b-4428-aa98-eac85d3860fa)

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-12 22:45:15 +00:00
Bently f090f4ca4a
feat(blocks): Add Code extraction Block (#8778)
This adds a code extraction block. It was originally made by
https://github.com/SerchioSD; I simply updated it and turned it into a PR.

### Changes 🏗️

Adds a new `code_extraction_block.py` block containing the code:


![image](https://github.com/user-attachments/assets/f7e61390-94e1-49e3-b8ee-b2dc7ea03bfe)
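
For context, a minimal sketch of what a code-extraction step does
(standard-library regex only; the actual block's fields and options may
differ):

```python
import re

FENCE = "`" * 3  # avoid writing literal triple backticks inside this example

def extract_code_blocks(text: str) -> list[tuple[str, str]]:
    """Return (language, code) pairs for every fenced code block in the text."""
    pattern = FENCE + r"(\w*)\n(.*?)" + FENCE
    return [
        (lang or "plain", code.strip())
        for lang, code in re.findall(pattern, text, re.DOTALL)
    ]

sample = f"Here you go:\n{FENCE}python\nprint('hi')\n{FENCE}\nand some prose."
print(extract_code_blocks(sample))  # [('python', "print('hi')")]
```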

### Updated video to show it working with latest mapped aliases


https://github.com/user-attachments/assets/a96aa708-f06f-4a00-a581-9f64d72f9ee8

---------

Co-authored-by: SerchioSD <69461657+serchiosd@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Nicholas Tindle <nicktindle@outlook.com>
2024-12-12 20:34:21 +00:00
dependabot[bot] 29c771ba1b
build(deps-dev): bump the development-dependencies group in /autogpt_platform/market with 2 updates (#8923)
Bumps the development-dependencies group in /autogpt_platform/market
with 2 updates: [ruff](https://github.com/astral-sh/ruff) and
[pyright](https://github.com/RobertCraigie/pyright-python).

Updates `ruff` from 0.8.1 to 0.8.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.8.2</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Avoid deprecated values (<code>AIR302</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14582">#14582</a>)</li>
<li>[<code>airflow</code>] Extend removed names for <code>AIR302</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14734">#14734</a>)</li>
<li>[<code>ruff</code>] Extend
<code>unnecessary-regular-expression</code> to non-literal strings
(<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14679">#14679</a>)</li>
<li>[<code>ruff</code>] Implement <code>used-dummy-variable</code>
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14611">#14611</a>)</li>
<li>[<code>ruff</code>] Implement <code>unnecessary-cast-to-int</code>
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14697">#14697</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>airflow</code>] Check <code>AIR001</code> from builtin or
providers <code>operators</code> module (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14631">#14631</a>)</li>
<li>[<code>flake8-pytest-style</code>] Remove <code>@</code> in
<code>pytest.mark.parametrize</code> rule messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14770">#14770</a>)</li>
<li>[<code>pandas-vet</code>] Skip rules if the <code>panda</code>
module hasn't been seen (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14671">#14671</a>)</li>
<li>[<code>pylint</code>] Fix false negatives for <code>ascii</code> and
<code>sorted</code> in <code>len-as-condition</code>
(<code>PLC1802</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14692">#14692</a>)</li>
<li>[<code>refurb</code>] Guard <code>hashlib</code> imports and mark
<code>hashlib-digest-hex</code> fix as safe (<code>FURB181</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14694">#14694</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>[<code>flake8-import-conventions</code>] Improve syntax check for
aliases supplied in configuration for
<code>unconventional-import-alias</code> (<code>ICN001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14745">#14745</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Revert: [pyflakes] Avoid false positives in
<code>@no_type_check</code> contexts (<code>F821</code>,
<code>F722</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14615">#14615</a>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14726">#14726</a>)</li>
<li>[<code>pep8-naming</code>] Avoid false positive for <code>class
Bar(type(foo))</code> (<code>N804</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14683">#14683</a>)</li>
<li>[<code>pycodestyle</code>] Handle f-strings properly for
<code>invalid-escape-sequence</code> (<code>W605</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14748">#14748</a>)</li>
<li>[<code>pylint</code>] Ignore <code>@overload</code> in
<code>PLR0904</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14730">#14730</a>)</li>
<li>[<code>refurb</code>] Handle non-finite decimals in
<code>verbose-decimal-constructor</code> (<code>FURB157</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14596">#14596</a>)</li>
<li>[<code>ruff</code>] Avoid emitting <code>assignment-in-assert</code>
when all references to the assigned variable are themselves inside
<code>assert</code>s (<code>RUF018</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14661">#14661</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Improve docs for <code>flake8-use-pathlib</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14741">#14741</a>)</li>
<li>Improve error messages and docs for
<code>flake8-comprehensions</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14729">#14729</a>)</li>
<li>[<code>flake8-type-checking</code>] Expands <code>TC006</code> docs
to better explain itself (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14749">#14749</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/Daverball"><code>@​Daverball</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a href="https://github.com/Lee-W"><code>@​Lee-W</code></a></li>
<li><a
href="https://github.com/Lokejoke"><code>@​Lokejoke</code></a></li>
<li><a
href="https://github.com/Matt-Ord"><code>@​Matt-Ord</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/Well2333"><code>@​Well2333</code></a></li>
<li><a
href="https://github.com/connorskees"><code>@​connorskees</code></a></li>
<li><a
href="https://github.com/dcreager"><code>@​dcreager</code></a></li>
<li><a
href="https://github.com/dhruvmanila"><code>@​dhruvmanila</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.8.2</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Avoid deprecated values (<code>AIR302</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14582">#14582</a>)</li>
<li>[<code>airflow</code>] Extend removed names for <code>AIR302</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14734">#14734</a>)</li>
<li>[<code>ruff</code>] Extend
<code>unnecessary-regular-expression</code> to non-literal strings
(<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14679">#14679</a>)</li>
<li>[<code>ruff</code>] Implement <code>used-dummy-variable</code>
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14611">#14611</a>)</li>
<li>[<code>ruff</code>] Implement <code>unnecessary-cast-to-int</code>
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14697">#14697</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>airflow</code>] Check <code>AIR001</code> from builtin or
providers <code>operators</code> module (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14631">#14631</a>)</li>
<li>[<code>flake8-pytest-style</code>] Remove <code>@</code> in
<code>pytest.mark.parametrize</code> rule messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14770">#14770</a>)</li>
<li>[<code>pandas-vet</code>] Skip rules if the <code>panda</code>
module hasn't been seen (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14671">#14671</a>)</li>
<li>[<code>pylint</code>] Fix false negatives for <code>ascii</code> and
<code>sorted</code> in <code>len-as-condition</code>
(<code>PLC1802</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14692">#14692</a>)</li>
<li>[<code>refurb</code>] Guard <code>hashlib</code> imports and mark
<code>hashlib-digest-hex</code> fix as safe (<code>FURB181</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14694">#14694</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>[<code>flake8-import-conventions</code>] Improve syntax check for
aliases supplied in configuration for
<code>unconventional-import-alias</code> (<code>ICN001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14745">#14745</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Revert: [pyflakes] Avoid false positives in
<code>@no_type_check</code> contexts (<code>F821</code>,
<code>F722</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14615">#14615</a>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14726">#14726</a>)</li>
<li>[<code>pep8-naming</code>] Avoid false positive for <code>class
Bar(type(foo))</code> (<code>N804</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14683">#14683</a>)</li>
<li>[<code>pycodestyle</code>] Handle f-strings properly for
<code>invalid-escape-sequence</code> (<code>W605</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14748">#14748</a>)</li>
<li>[<code>pylint</code>] Ignore <code>@overload</code> in
<code>PLR0904</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14730">#14730</a>)</li>
<li>[<code>refurb</code>] Handle non-finite decimals in
<code>verbose-decimal-constructor</code> (<code>FURB157</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14596">#14596</a>)</li>
<li>[<code>ruff</code>] Avoid emitting <code>assignment-in-assert</code>
when all references to the assigned variable are themselves inside
<code>assert</code>s (<code>RUF018</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14661">#14661</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Improve docs for <code>flake8-use-pathlib</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14741">#14741</a>)</li>
<li>Improve error messages and docs for
<code>flake8-comprehensions</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14729">#14729</a>)</li>
<li>[<code>flake8-type-checking</code>] Expands <code>TC006</code> docs
to better explain itself (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14749">#14749</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b0e26e6fc8"><code>b0e26e6</code></a>
Bump version to 0.8.2 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14789">#14789</a>)</li>
<li><a
href="e9941cd714"><code>e9941cd</code></a>
[red-knot] Move standalone expr inference to <code>for</code> non-name
target (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14788">#14788</a>)</li>
<li><a
href="43bf1a8907"><code>43bf1a8</code></a>
Add tests for &quot;keyword as identifier&quot; syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14754">#14754</a>)</li>
<li><a
href="fda8b1f884"><code>fda8b1f</code></a>
[<code>ruff</code>] Unnecessary cast to <code>int</code>
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14697">#14697</a>)</li>
<li><a
href="2d3f557875"><code>2d3f557</code></a>
[red-knot] Fallback for <code>typing._NoDefaultType</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14783">#14783</a>)</li>
<li><a
href="bd27bfab5d"><code>bd27bfa</code></a>
[red-knot] Unify <code>setup_db()</code> functions, add
<code>TestDb</code> builder (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14777">#14777</a>)</li>
<li><a
href="155d34bbb9"><code>155d34b</code></a>
[red-knot] Infer precise types for <code>len()</code> calls (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14599">#14599</a>)</li>
<li><a
href="04c887c8fc"><code>04c887c</code></a>
Fix references for <code>async-busy-wait</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14775">#14775</a>)</li>
<li><a
href="af43bd4b0f"><code>af43bd4</code></a>
[red-knot] Gradual forms do not participate in equivalence/subtyping (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14758">#14758</a>)</li>
<li><a
href="614917769e"><code>6149177</code></a>
Remove <code>@</code> in <code>pytest.mark.parametrize</code> rule
messages (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14770">#14770</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.1...0.8.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `pyright` from 1.1.389 to 1.1.390
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ee025bc694"><code>ee025bc</code></a>
Pyright NPM Package update to 1.1.390 (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/325">#325</a>)</li>
<li>See full diff in <a
href="https://github.com/RobertCraigie/pyright-python/compare/v1.1.389...v1.1.390">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-12 20:01:58 +00:00
Reinier van der Leer 582e12c766
ci(frontend): Speed up test jobs (#8949)
- Resolves #8948

### Changes 🏗️

- Parallelize frontend test job into a per-browser matrix
- Speed up "Free Disk Space" step by disabling removal of large system
packages
2024-12-12 19:54:05 +00:00
Reinier van der Leer e3cf605e9b
feat(frontend): Disallow webhook+input or webhook+webhook in the same graph (#8861)
- Resolves #8853

Disallow combining a webhook block with another webhook or input block,
because we can't run those. Our current approach to validating the input
for a graph's starting nodes prohibits such cases.
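
A self-contained sketch of the rule being enforced (the node kinds below are
illustrative labels, not the real block IDs): a webhook trigger cannot
coexist with another webhook trigger or with an input block.

```python
# Sketch of the "can this block be added?" check described above (illustrative only).
def can_add_block(existing_kinds: list[str], new_kind: str) -> bool:
    has_webhook = "webhook" in existing_kinds
    has_input = "input" in existing_kinds
    if new_kind == "webhook":
        return not has_webhook and not has_input
    if new_kind == "input":
        return not has_webhook
    return True

print(can_add_block(["webhook"], "input"))    # False
print(can_add_block(["input"], "webhook"))    # False
print(can_add_block(["input"], "llm_call"))   # True
```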

Demo:


https://github.com/user-attachments/assets/ac098765-bb5f-4218-8cd4-ad992b1b8cda

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Add a webhook-triggered block -> can't add another, also can't add
an input block
  - [x] Add an input block -> can't add a webhook-triggered block

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-12 13:56:28 +00:00
Reinier van der Leer abf73e8d66
fix(backend): Deactivate graph on delete (#8947)
- Resolves #8945

### Changes 🏗️

- Call `on_graph_deactivate` on current active version in `DELETE
/api/graphs/{graph_id}` endpoint
2024-12-11 19:52:07 +00:00
Bently b16bf42fa3
feat(frontend): Update and fix tutorial (#8943)
This is to fix [Tutorial highlight causes bottom buttons to move up
#8942](https://github.com/Significant-Gravitas/AutoGPT/issues/8942)

### Changes 🏗️

Updates the tutorial to add a short delay before moving on to the next
step, which prevents the UI from lifting up in a weird way (skip to 14
seconds in the video below to see this being fixed). Also updates the
positioning of some of the steps.

Video showing the latest run through the whole tutorial:


https://github.com/user-attachments/assets/4cf09a2f-8ed2-45bd-9909-aa92540af845
2024-12-11 19:48:48 +00:00
Reinier van der Leer 33b9eef376
refactor(backend): Simplify `CredentialsField` usage + use `ProviderName` globally (#8725)
- Resolves #8931
- Follow-up to #8358

### Changes 🏗️
- Avoid double specifying provider and cred types on `credentials`
inputs
- Move `credentials` sub-schema validation from `CredentialsField` to
`CredentialsMetaInput.validate_credentials_field_schema(..)`, which is
called in `BlockSchema.__pydantic_init_subclass__`
- Use `ProviderName` enum globally
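
A minimal, self-contained sketch of the validate-on-subclass-definition
pattern, assuming a simplified field layout (the real check lives in
`CredentialsMetaInput.validate_credentials_field_schema(..)`; the metadata
key and classes below are simplified for illustration):

```python
from enum import Enum
from pydantic import BaseModel, Field

class ProviderName(str, Enum):
    GITHUB = "github"
    GOOGLE = "google"

class BlockSchema(BaseModel):
    @classmethod
    def __pydantic_init_subclass__(cls, **kwargs):
        # Runs whenever a block schema subclass is defined, after its fields are built.
        super().__pydantic_init_subclass__(**kwargs)
        field = cls.model_fields.get("credentials")
        if field is None:
            return
        extra = field.json_schema_extra if isinstance(field.json_schema_extra, dict) else {}
        providers = extra.get("credentials_provider", [])  # simplified metadata key
        unknown = [p for p in providers if p not in ProviderName._value2member_map_]
        if unknown:
            raise TypeError(f"{cls.__name__}: unknown credentials provider(s) {unknown}")

class ExampleBlockInput(BlockSchema):
    credentials: dict = Field(
        default_factory=dict,
        json_schema_extra={"credentials_provider": ["github"]},
    )
```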
2024-12-11 19:27:09 +00:00
Abhimanyu Yadav b8a3ffc04a
fix(frontend) : Optional number input (#8940)
- Resolve #8928 

Currently, the frontend renders a string input for an optional integer.
I have now corrected it.

![Screenshot 2024-12-11 at 10 23
48 AM](https://github.com/user-attachments/assets/a47eaf4c-97b0-458c-8a2c-fc66fdd0d770)
2024-12-11 19:25:25 +00:00
Krzysztof Czerwinski 3fd2b7ce4a
feat(backend): Update schema for PAYG System (#8944)
First step for the PAYG System.

### Changes 🏗️

- Add `stripeCustomerId` to `User` model
- Rename model `UserBlockCredit` to `CreditTransaction`
- Rename model `UserBlockCreditType` to `CreditTransactionType`
- Update related code
- Add a migration

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-11 16:52:13 +00:00
Zamil Majdy 6490b4e188
chore(platform):Refactor GraphExecution naming clash and remove unused Graph Execution functions (#8939)
This is a follow-up of
https://github.com/Significant-Gravitas/AutoGPT/pull/8752

Several APIs and functions related to graph execution are now unused.
There is also a naming clash: `GraphExecution` exists in both `graph.py`
and `execution.py`.

### Changes 🏗️

* Renamed `GraphExecution` in `execution.py` to `GraphExecutionEntry`,
this is only used as a queue entry for execution.
* Removed unused `get_graph_execution` & `list_executions` in
`execution.py`.
* Removed `with_run` option on `get_graph` function in `graph.py`.
* Removed `GraphMetaWithRuns`
* Removed exposed functions only for testing.
* Removed `executions` fields in Graph model.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Krzysztof Czerwinski <34861343+kcze@users.noreply.github.com>
2024-12-11 15:41:15 +00:00
Krzysztof Czerwinski 7a9115db18
fix(frontend): Make pins smaller and fix hover area and highlight for input pins (#8941)
Recently, pins were made slightly bigger and needlessly misaligned. The
problem in the linked issue was the connection area, not the pin size.
- https://github.com/Significant-Gravitas/AutoGPT/issues/8913

### Changes 🏗️

- Revert pins size to smaller
- Fix hover area and highlight for input pins

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Connect, reconnect, remove connection
  - [x] Tutorial works

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-11 09:16:36 +00:00
Krzysztof Czerwinski 6307ca1841
feat(platform): Include all agent versions in Runs in Monitor (#8752)
The graph version is bumped on each save. When the agent version
changes, the past execution history disappears because the monitor page
only shows the latest version's execution history.

### Changes 🏗️

- Add `get_executions` on the backend that returns all executions of all
graphs for a user
- Display all executions (for all versions) for graphs in Monitor
- Rename ts mirror type `ExecutionMeta` to `GraphExecution` for
consistency with the backend
- Remove redundant `FlowRun` type on the frontend and use
`GraphExecution` instead
- Round execution duration text in Monitor to one decimal place

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-12-11 01:28:50 +00:00
Zamil Majdy d827d4f9e4
feat(block): Support find all regex extraction for ExtractTextInformationBlock (#8934)
ExtractTextInformationBlock currently only supports extracting a single match.

### Changes 🏗️

Added a `find_all` option to ExtractTextInformationBlock (see the sketch below).
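
A tiny sketch of the behavioural difference, using the standard `re` module
(field names in the real block may differ):

```python
import re

text = "id=1 id=2 id=3"
pattern = r"id=(\d+)"

# Previous behaviour: only the first match is extracted.
first = re.search(pattern, text)
print(first.group(1) if first else None)  # '1'

# With the find_all option enabled: every match is returned.
print(re.findall(pattern, text))  # ['1', '2', '3']
```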

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-10 21:56:06 +00:00
Zamil Majdy 984d42234c
fix(backend): Add missing DB indexes (#8929)
Some table foreign keys are not properly indexed, which can cause full
table scans on queries issued by the code.

### Changes 🏗️

Added DB indexes on several tables.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-10 10:19:22 +00:00
Zamil Majdy 79c0c314e2
feat(frontend): Add field extraction handle for block with object output (#8900)
This addresses
https://github.com/Significant-Gravitas/AutoGPT/issues/8741

We have quite a few blocks with (object) outputs. The only way to really
use these is to add a "Find In Dictionary" block to pick out each
property.

If the structure of the output object is known, we should expose the
object's properties directly as sub-outputs. This will make a huge
difference in UX and make using these blocks much easier.

### Changes 🏗️

Recursively flatten object fields into output node handles
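
As a rough sketch of the flattening step, assuming JSON-schema-style output
definitions (not the actual frontend code, which is TypeScript):

```python
# Recursively turn nested object properties into dotted sub-output handle names.
def flatten_handles(schema: dict, prefix: str = "") -> list[str]:
    handles = []
    for name, sub in schema.get("properties", {}).items():
        path = f"{prefix}{name}"
        if sub.get("type") == "object" and "properties" in sub:
            handles.extend(flatten_handles(sub, prefix=f"{path}."))
        else:
            handles.append(path)
    return handles

output_schema = {
    "type": "object",
    "properties": {
        "user": {
            "type": "object",
            "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
        },
        "ok": {"type": "boolean"},
    },
}
print(flatten_handles(output_schema))  # ['user.name', 'user.age', 'ok']
```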

<img width="637" alt="image"
src="https://github.com/user-attachments/assets/dac1f691-9866-4bb7-96b7-20fa6ddbb616">
<img width="773" alt="image"
src="https://github.com/user-attachments/assets/f8e7f97c-b245-40bd-b84f-2c044f5f9e23">


### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-09 21:47:48 +00:00
dependabot[bot] e6d728b081
build(deps-dev): bump the development-dependencies group in /autogpt_platform/autogpt_libs with 2 updates (#8924)
Bumps the development-dependencies group in
/autogpt_platform/autogpt_libs with 2 updates:
[redis](https://github.com/redis/redis-py) and
[ruff](https://github.com/astral-sh/ruff).

Updates `redis` from 5.2.0 to 5.2.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/redis/redis-py/releases">redis's
releases</a>.</em></p>
<blockquote>
<h2>5.2.1</h2>
<h1>Changes</h1>
<h2>🐛 Bug Fixes</h2>
<ul>
<li>Fixed unsecured tempfile.mktemp() command usage (<a
href="https://redirect.github.com/redis/redis-py/issues/3446">#3446</a>)</li>
<li>Fixed bug with SLOWLOG GET response parsing on Redis Software (<a
href="https://redirect.github.com/redis/redis-py/issues/3441">#3441</a>)</li>
<li>Fixed issue with invoking _close() on closed event loop (<a
href="https://redirect.github.com/redis/redis-py/issues/3438">#3438</a>)</li>
</ul>
<h2>🧰 Maintenance</h2>
<ul>
<li>Migrate test infrastructure to new custom docker images (<a
href="https://redirect.github.com/redis/redis-py/issues/3415">#3415</a>)</li>
<li>Fixed flacky test with HEXPIREAT command (<a
href="https://redirect.github.com/redis/redis-py/issues/3437">#3437</a>)</li>
</ul>
<h2>Contributors</h2>
<p>We'd like to thank all the contributors who worked on this
release!</p>
<p><a href="https://github.com/IlianIliev"><code>@​IlianIliev</code></a>
<a href="https://github.com/uglide"><code>@​uglide</code></a> <a
href="https://github.com/vladvildanov"><code>@​vladvildanov</code></a>
<a href="https://github.com/teodorfn"><code>@​teodorfn</code></a> <a
href="https://github.com/akx"><code>@​akx</code></a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a74fa6a3dc"><code>a74fa6a</code></a>
Release 5.2.1 (<a
href="https://redirect.github.com/redis/redis-py/issues/3451">#3451</a>)</li>
<li>See full diff in <a
href="https://github.com/redis/redis-py/compare/v5.2.0...v5.2.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `ruff` from 0.8.1 to 0.8.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.8.2</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Avoid deprecated values (<code>AIR302</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14582">#14582</a>)</li>
<li>[<code>airflow</code>] Extend removed names for <code>AIR302</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14734">#14734</a>)</li>
<li>[<code>ruff</code>] Extend
<code>unnecessary-regular-expression</code> to non-literal strings
(<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14679">#14679</a>)</li>
<li>[<code>ruff</code>] Implement <code>used-dummy-variable</code>
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14611">#14611</a>)</li>
<li>[<code>ruff</code>] Implement <code>unnecessary-cast-to-int</code>
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14697">#14697</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>airflow</code>] Check <code>AIR001</code> from builtin or
providers <code>operators</code> module (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14631">#14631</a>)</li>
<li>[<code>flake8-pytest-style</code>] Remove <code>@</code> in
<code>pytest.mark.parametrize</code> rule messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14770">#14770</a>)</li>
<li>[<code>pandas-vet</code>] Skip rules if the <code>panda</code>
module hasn't been seen (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14671">#14671</a>)</li>
<li>[<code>pylint</code>] Fix false negatives for <code>ascii</code> and
<code>sorted</code> in <code>len-as-condition</code>
(<code>PLC1802</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14692">#14692</a>)</li>
<li>[<code>refurb</code>] Guard <code>hashlib</code> imports and mark
<code>hashlib-digest-hex</code> fix as safe (<code>FURB181</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14694">#14694</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>[<code>flake8-import-conventions</code>] Improve syntax check for
aliases supplied in configuration for
<code>unconventional-import-alias</code> (<code>ICN001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14745">#14745</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Revert: [pyflakes] Avoid false positives in
<code>@no_type_check</code> contexts (<code>F821</code>,
<code>F722</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14615">#14615</a>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14726">#14726</a>)</li>
<li>[<code>pep8-naming</code>] Avoid false positive for <code>class
Bar(type(foo))</code> (<code>N804</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14683">#14683</a>)</li>
<li>[<code>pycodestyle</code>] Handle f-strings properly for
<code>invalid-escape-sequence</code> (<code>W605</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14748">#14748</a>)</li>
<li>[<code>pylint</code>] Ignore <code>@overload</code> in
<code>PLR0904</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14730">#14730</a>)</li>
<li>[<code>refurb</code>] Handle non-finite decimals in
<code>verbose-decimal-constructor</code> (<code>FURB157</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14596">#14596</a>)</li>
<li>[<code>ruff</code>] Avoid emitting <code>assignment-in-assert</code>
when all references to the assigned variable are themselves inside
<code>assert</code>s (<code>RUF018</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14661">#14661</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Improve docs for <code>flake8-use-pathlib</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14741">#14741</a>)</li>
<li>Improve error messages and docs for
<code>flake8-comprehensions</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14729">#14729</a>)</li>
<li>[<code>flake8-type-checking</code>] Expands <code>TC006</code> docs
to better explain itself (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14749">#14749</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/Daverball"><code>@​Daverball</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a href="https://github.com/Lee-W"><code>@​Lee-W</code></a></li>
<li><a
href="https://github.com/Lokejoke"><code>@​Lokejoke</code></a></li>
<li><a
href="https://github.com/Matt-Ord"><code>@​Matt-Ord</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/Well2333"><code>@​Well2333</code></a></li>
<li><a
href="https://github.com/connorskees"><code>@​connorskees</code></a></li>
<li><a
href="https://github.com/dcreager"><code>@​dcreager</code></a></li>
<li><a
href="https://github.com/dhruvmanila"><code>@​dhruvmanila</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.8.2</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Avoid deprecated values (<code>AIR302</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14582">#14582</a>)</li>
<li>[<code>airflow</code>] Extend removed names for <code>AIR302</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14734">#14734</a>)</li>
<li>[<code>ruff</code>] Extend
<code>unnecessary-regular-expression</code> to non-literal strings
(<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14679">#14679</a>)</li>
<li>[<code>ruff</code>] Implement <code>used-dummy-variable</code>
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14611">#14611</a>)</li>
<li>[<code>ruff</code>] Implement <code>unnecessary-cast-to-int</code>
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14697">#14697</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>airflow</code>] Check <code>AIR001</code> from builtin or
providers <code>operators</code> module (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14631">#14631</a>)</li>
<li>[<code>flake8-pytest-style</code>] Remove <code>@</code> in
<code>pytest.mark.parametrize</code> rule messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14770">#14770</a>)</li>
<li>[<code>pandas-vet</code>] Skip rules if the <code>panda</code>
module hasn't been seen (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14671">#14671</a>)</li>
<li>[<code>pylint</code>] Fix false negatives for <code>ascii</code> and
<code>sorted</code> in <code>len-as-condition</code>
(<code>PLC1802</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14692">#14692</a>)</li>
<li>[<code>refurb</code>] Guard <code>hashlib</code> imports and mark
<code>hashlib-digest-hex</code> fix as safe (<code>FURB181</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14694">#14694</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>[<code>flake8-import-conventions</code>] Improve syntax check for
aliases supplied in configuration for
<code>unconventional-import-alias</code> (<code>ICN001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14745">#14745</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Revert: [pyflakes] Avoid false positives in
<code>@no_type_check</code> contexts (<code>F821</code>,
<code>F722</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14615">#14615</a>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14726">#14726</a>)</li>
<li>[<code>pep8-naming</code>] Avoid false positive for <code>class
Bar(type(foo))</code> (<code>N804</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14683">#14683</a>)</li>
<li>[<code>pycodestyle</code>] Handle f-strings properly for
<code>invalid-escape-sequence</code> (<code>W605</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14748">#14748</a>)</li>
<li>[<code>pylint</code>] Ignore <code>@overload</code> in
<code>PLR0904</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14730">#14730</a>)</li>
<li>[<code>refurb</code>] Handle non-finite decimals in
<code>verbose-decimal-constructor</code> (<code>FURB157</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14596">#14596</a>)</li>
<li>[<code>ruff</code>] Avoid emitting <code>assignment-in-assert</code>
when all references to the assigned variable are themselves inside
<code>assert</code>s (<code>RUF018</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14661">#14661</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Improve docs for <code>flake8-use-pathlib</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14741">#14741</a>)</li>
<li>Improve error messages and docs for
<code>flake8-comprehensions</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14729">#14729</a>)</li>
<li>[<code>flake8-type-checking</code>] Expands <code>TC006</code> docs
to better explain itself (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14749">#14749</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b0e26e6fc8"><code>b0e26e6</code></a>
Bump version to 0.8.2 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14789">#14789</a>)</li>
<li><a
href="e9941cd714"><code>e9941cd</code></a>
[red-knot] Move standalone expr inference to <code>for</code> non-name
target (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14788">#14788</a>)</li>
<li><a
href="43bf1a8907"><code>43bf1a8</code></a>
Add tests for &quot;keyword as identifier&quot; syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14754">#14754</a>)</li>
<li><a
href="fda8b1f884"><code>fda8b1f</code></a>
[<code>ruff</code>] Unnecessary cast to <code>int</code>
(<code>RUF046</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14697">#14697</a>)</li>
<li><a
href="2d3f557875"><code>2d3f557</code></a>
[red-knot] Fallback for <code>typing._NoDefaultType</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14783">#14783</a>)</li>
<li><a
href="bd27bfab5d"><code>bd27bfa</code></a>
[red-knot] Unify <code>setup_db()</code> functions, add
<code>TestDb</code> builder (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14777">#14777</a>)</li>
<li><a
href="155d34bbb9"><code>155d34b</code></a>
[red-knot] Infer precise types for <code>len()</code> calls (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14599">#14599</a>)</li>
<li><a
href="04c887c8fc"><code>04c887c</code></a>
Fix references for <code>async-busy-wait</code> (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14775">#14775</a>)</li>
<li><a
href="af43bd4b0f"><code>af43bd4</code></a>
[red-knot] Gradual forms do not participate in equivalence/subtyping (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14758">#14758</a>)</li>
<li><a
href="614917769e"><code>6149177</code></a>
Remove <code>@</code> in <code>pytest.mark.parametrize</code> rule
messages (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14770">#14770</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.1...0.8.2">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-09 20:50:53 +00:00
dependabot[bot] a7a526e820
build(deps): bump the production-dependencies group in /autogpt_platform/frontend with 12 updates (#8925)
Bumps the production-dependencies group in /autogpt_platform/frontend
with 12 updates:

| Package | From | To |
| --- | --- | --- |
| [@faker-js/faker](https://github.com/faker-js/faker) | `9.2.0` |
`9.3.0` |
|
[@next/third-parties](https://github.com/vercel/next.js/tree/HEAD/packages/third-parties)
| `15.0.3` | `15.0.4` |
| [@supabase/supabase-js](https://github.com/supabase/supabase-js) |
`2.46.1` | `2.47.3` |
|
[@xyflow/react](https://github.com/xyflow/xyflow/tree/HEAD/packages/react)
| `12.3.5` | `12.3.6` |
| [class-variance-authority](https://github.com/joe-bell/cva) | `0.7.0`
| `0.7.1` |
| [dotenv](https://github.com/motdotla/dotenv) | `16.4.5` | `16.4.7` |
|
[lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react)
| `0.462.0` | `0.468.0` |
| [next-themes](https://github.com/pacocoursey/next-themes) | `0.4.3` |
`0.4.4` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.4.0`
| `9.4.2` |
| [react-hook-form](https://github.com/react-hook-form/react-hook-form)
| `7.53.2` | `7.54.0` |
| [react-icons](https://github.com/react-icons/react-icons) | `5.3.0` |
`5.4.0` |
| [recharts](https://github.com/recharts/recharts) | `2.13.3` | `2.14.1`
|

Updates `@faker-js/faker` from 9.2.0 to 9.3.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/releases"><code>@​faker-js/faker</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v9.3.0</h2>
<h2>What's Changed</h2>
<ul>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3246">faker-js/faker#3246</a></li>
<li>infra: show eslint progress by <a
href="https://github.com/Shinigami92"><code>@​Shinigami92</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3172">faker-js/faker#3172</a></li>
<li>infra(unicorn): prefer-string-slice by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3247">faker-js/faker#3247</a></li>
<li>infra: name eslint config groups for inspection by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3249">faker-js/faker#3249</a></li>
<li>infra(ci): prepare CI for GitHub merge queues by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3245">faker-js/faker#3245</a></li>
<li>fix(internet): ensure domainWord always returns a valid value in all
locales by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3253">faker-js/faker#3253</a></li>
<li>infra: remove preflight failure comment by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3188">faker-js/faker#3188</a></li>
<li>docs(guide): remove esModuleInterop flag from config by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3192">faker-js/faker#3192</a></li>
<li>test: fix vite import warning by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3248">faker-js/faker#3248</a></li>
<li>refactor(locale): lowercase Mexican color names by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3200">faker-js/faker#3200</a></li>
<li>test: verify the generated image links are working by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3127">faker-js/faker#3127</a></li>
<li>chore(deps): update dependency eslint-plugin-file-progress to v3 by
<a href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3252">faker-js/faker#3252</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3257">faker-js/faker#3257</a></li>
<li>refactor(locale): improve zh_CN vehicle manufacturers by <a
href="https://github.com/Heuluck"><code>@​Heuluck</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3254">faker-js/faker#3254</a></li>
<li>refactor(finance): deprecate maskedNumber for removal by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3201">faker-js/faker#3201</a></li>
<li>feat: add initial seed parameter to constructors by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3220">faker-js/faker#3220</a></li>
<li>docs: faker.animal.type now has 44 possible animals by <a
href="https://github.com/s-inu"><code>@​s-inu</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3258">faker-js/faker#3258</a></li>
<li>docs: expose documentation for all utilities by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3242">faker-js/faker#3242</a></li>
<li>infra(commit-and-tag-version): auto-bump version in usage-guide by
<a href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3250">faker-js/faker#3250</a></li>
<li>infra(unicorn): consistent-function-scoping by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3255">faker-js/faker#3255</a></li>
<li>chore(deps): bump <code>@​eslint/plugin-kit</code> from 0.2.2 to
0.2.3 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3268">faker-js/faker#3268</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3271">faker-js/faker#3271</a></li>
<li>chore(test): cleanup usages of randomSeed by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3260">faker-js/faker#3260</a></li>
<li>refactor(locale): split en_AU_ocker first_names by sex by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3270">faker-js/faker#3270</a></li>
<li>infra(CI): skip required CI steps in merge_queues by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3265">faker-js/faker#3265</a></li>
<li>refactor(word): cleanup filter-word-list-by-length.ts by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3262">faker-js/faker#3262</a></li>
<li>chore: fix import styling by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3273">faker-js/faker#3273</a></li>
<li>chore: import validator functions individually by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3274">faker-js/faker#3274</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3277">faker-js/faker#3277</a></li>
<li>fix(locale): fix incorrect accents in it first_name by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3281">faker-js/faker#3281</a></li>
<li>test(image): improve error text by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3278">faker-js/faker#3278</a></li>
<li>fix(locale): add Isadora to female names in pt_BR for consistency by
<a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3282">faker-js/faker#3282</a></li>
<li>refactor(locale): split up Spanish generic first names by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3279">faker-js/faker#3279</a></li>
<li>docs(guide): fix link to <code>helpers</code> module by <a
href="https://github.com/yoshi2no"><code>@​yoshi2no</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3289">faker-js/faker#3289</a></li>
<li>docs: add missing example return value for internet.jwt by <a
href="https://github.com/xDivisionByZerox"><code>@​xDivisionByZerox</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3286">faker-js/faker#3286</a></li>
<li>refactor(locale): sort person data by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3269">faker-js/faker#3269</a></li>
<li>docs: improve example output for replaceSymbols by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3304">faker-js/faker#3304</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3305">faker-js/faker#3305</a></li>
<li>chore(deps): update all non-major dependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3292">faker-js/faker#3292</a></li>
<li>chore(deps): update codecov/codecov-action action to v5 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3299">faker-js/faker#3299</a></li>
<li>chore(deps): update dependency typescript to v5.7.2 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3296">faker-js/faker#3296</a></li>
<li>chore(deps): update dependency <code>@​vueuse/core</code> to v12 by
<a href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3300">faker-js/faker#3300</a></li>
<li>chore(deps): update eslint by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3293">faker-js/faker#3293</a></li>
<li>chore(deps): update devdependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3298">faker-js/faker#3298</a></li>
<li>chore(deps): update dependency vitepress to v1.5.0 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3297">faker-js/faker#3297</a></li>
<li>chore(deps): update vitest by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3294">faker-js/faker#3294</a></li>
<li>chore(deps): update dependency prettier to v3.4.1 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3295">faker-js/faker#3295</a></li>
<li>infra(unicorn): prefer-export-from by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3272">faker-js/faker#3272</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/blob/next/CHANGELOG.md"><code>@​faker-js/faker</code>'s
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/faker-js/faker/compare/v9.2.0...v9.3.0">9.3.0</a>
(2024-12-02)</h2>
<h3>Features</h3>
<ul>
<li>add initial seed parameter to constructors (<a
href="https://redirect.github.com/faker-js/faker/issues/3220">#3220</a>)
(<a
href="1633c8deb8">1633c8d</a>)</li>
</ul>
<h3>Changed Locales</h3>
<ul>
<li><strong>locale:</strong> improve zh_CN vehicle manufacturers (<a
href="https://redirect.github.com/faker-js/faker/issues/3254">#3254</a>)
(<a
href="9abaed1061">9abaed1</a>)</li>
<li><strong>locale:</strong> lowercase Mexican color names (<a
href="https://redirect.github.com/faker-js/faker/issues/3200">#3200</a>)
(<a
href="0d850758d0">0d85075</a>)</li>
<li><strong>locale:</strong> sort person data (<a
href="https://redirect.github.com/faker-js/faker/issues/3269">#3269</a>)
(<a
href="01e20e9695">01e20e9</a>)</li>
<li><strong>locale:</strong> split en_AU_ocker first_names by sex (<a
href="https://redirect.github.com/faker-js/faker/issues/3270">#3270</a>)
(<a
href="b0a5ad38bb">b0a5ad3</a>)</li>
<li><strong>locale:</strong> split up Spanish generic first names (<a
href="https://redirect.github.com/faker-js/faker/issues/3279">#3279</a>)
(<a
href="5d5fe30ab4">5d5fe30</a>)</li>
<li><strong>locale:</strong> update Polish city name (<a
href="https://redirect.github.com/faker-js/faker/issues/3306">#3306</a>)
(<a
href="53441b7773">53441b7</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>internet:</strong> ensure domainWord always returns a valid
value in all locales (<a
href="https://redirect.github.com/faker-js/faker/issues/3253">#3253</a>)
(<a
href="525fedc91b">525fedc</a>)</li>
<li><strong>locale:</strong> add Isadora to female names in pt_BR for
consistency (<a
href="https://redirect.github.com/faker-js/faker/issues/3282">#3282</a>)
(<a
href="b390432626">b390432</a>)</li>
<li><strong>locale:</strong> fix incorrect accents in it first_name (<a
href="https://redirect.github.com/faker-js/faker/issues/3281">#3281</a>)
(<a
href="e0fb23ef81">e0fb23e</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bdd55adc39"><code>bdd55ad</code></a>
chore(release): 9.3.0 (<a
href="https://redirect.github.com/faker-js/faker/issues/3307">#3307</a>)</li>
<li><a
href="ecb5cb4a42"><code>ecb5cb4</code></a>
chore(deps): lock file maintenance (<a
href="https://redirect.github.com/faker-js/faker/issues/3311">#3311</a>)</li>
<li><a
href="0fb42953af"><code>0fb4295</code></a>
chore(deps): update vitest to v2.1.7 (<a
href="https://redirect.github.com/faker-js/faker/issues/3310">#3310</a>)</li>
<li><a
href="fcfb873913"><code>fcfb873</code></a>
chore(deps): update dependency <code>@​vitest/ui</code> to v2.1.7 (<a
href="https://redirect.github.com/faker-js/faker/issues/3309">#3309</a>)</li>
<li><a
href="53441b7773"><code>53441b7</code></a>
refactor(locale): update Polish city name (<a
href="https://redirect.github.com/faker-js/faker/issues/3306">#3306</a>)</li>
<li><a
href="7d59cd9bfb"><code>7d59cd9</code></a>
infra(unicorn): prefer-export-from (<a
href="https://redirect.github.com/faker-js/faker/issues/3272">#3272</a>)</li>
<li><a
href="176d430036"><code>176d430</code></a>
chore(deps): update dependency prettier to v3.4.1 (<a
href="https://redirect.github.com/faker-js/faker/issues/3295">#3295</a>)</li>
<li><a
href="ed2c3a2014"><code>ed2c3a2</code></a>
chore(deps): update vitest (<a
href="https://redirect.github.com/faker-js/faker/issues/3294">#3294</a>)</li>
<li><a
href="dd3dbbdd8c"><code>dd3dbbd</code></a>
chore(deps): update dependency vitepress to v1.5.0 (<a
href="https://redirect.github.com/faker-js/faker/issues/3297">#3297</a>)</li>
<li><a
href="54cdd5b885"><code>54cdd5b</code></a>
chore(deps): update devdependencies (<a
href="https://redirect.github.com/faker-js/faker/issues/3298">#3298</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/faker-js/faker/compare/v9.2.0...v9.3.0">compare
view</a></li>
</ul>
</details>
<br />
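Of the `@faker-js/faker` 9.3.0 changes listed above, the constructor-level seed (#3220) is the one most relevant to deterministic test data. A minimal TypeScript sketch, assuming the new constructor option is simply named `seed`:

```ts
// Hedged sketch based on the "add initial seed parameter to constructors" note;
// the `seed` option name and value shape are assumptions, not taken from the docs.
import { Faker, en } from "@faker-js/faker";

const faker = new Faker({ locale: [en], seed: 1234 });

// With a fixed seed, generated data is reproducible across test runs.
console.log(faker.person.firstName());
```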

Updates `@next/third-parties` from 15.0.3 to 15.0.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases"><code>@​next/third-parties</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v15.0.4</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting changes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>Use React 19 stable in Pages Router: <a
href="https://redirect.github.com/vercel/next.js/pull/73564">vercel/next.js#73564</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/eps1lon"><code>@​eps1lon</code></a></p>
<h2>v15.0.4-canary.48</h2>
<h3>Misc Changes</h3>
<ul>
<li>refactor(turbopack): Use <code>ResolvedVc&lt;T&gt;</code> for struct
fields in extra crates: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73451">#73451</a></li>
<li>refactor(turbopack): Use <code>ResolvedVc&lt;T&gt;</code> for struct
fields in <code>next-api</code>, final part: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73367">#73367</a></li>
<li>docs: Fix image component API reference parsing: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73658">#73658</a></li>
<li>docs: fix code block language in images-and-fonts docs: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73492">#73492</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/kdy1"><code>@​kdy1</code></a>, <a
href="https://github.com/eps1lon"><code>@​eps1lon</code></a>, and <a
href="https://github.com/JamBalaya56562"><code>@​JamBalaya56562</code></a>
for helping!</p>
<h2>v15.0.4-canary.47</h2>
<h3>Misc Changes</h3>
<ul>
<li>test: fix next-sass test: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73633">#73633</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/samcx"><code>@​samcx</code></a> for
helping!</p>
<h2>v15.0.4-canary.46</h2>
<h3>Core Changes</h3>
<ul>
<li>Use consistent error formatting in terminal: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/71909">#71909</a></li>
<li>[Segment Cache] Interception routes: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73434">#73434</a></li>
<li>Upgrade to typescript 5.7: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73594">#73594</a></li>
<li>[Segment Cache] Use LRU to manage cache data : <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73486">#73486</a></li>
<li>[Segment Cache] Add isPartial to segment prefetch : <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73528">#73528</a></li>
<li>Fix missing client reference manifest error when using route groups:
<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73606">#73606</a></li>
<li>feat(after): stabilize <code>unstable_after</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73605">#73605</a></li>
<li>[Segment Cache] Add isHeadPartial: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73530">#73530</a></li>
<li>fix: do not add suffix for sitemap under group routes: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73570">#73570</a></li>
<li>Dynamic IO: Improve error handling: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73607">#73607</a></li>
</ul>
<h3>Example Changes</h3>
<ul>
<li>Bump <code>examples/**</code> Eslint to v9: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73560">#73560</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d6a6aa1406"><code>d6a6aa1</code></a>
v15.0.4</li>
<li><a
href="8774088bc2"><code>8774088</code></a>
[Backport 15.0] Use React 19 stable (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73564">#73564</a>)</li>
<li>See full diff in <a
href="https://github.com/vercel/next.js/commits/v15.0.4/packages/third-parties">compare
view</a></li>
</ul>
</details>
<br />

Updates `@supabase/supabase-js` from 2.46.1 to 2.47.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-js/releases"><code>@​supabase/supabase-js</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v2.47.3</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.2...v2.47.3">2.47.3</a>
(2024-12-09)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Bind proper object to setAuth on Realtime callback (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1324">#1324</a>)
(<a
href="325c2c9b25">325c2c9</a>)</li>
</ul>
<h2>v2.47.2</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.1...v2.47.2">2.47.2</a>
(2024-12-06)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>bump auth-js to v2.66.1 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1325">#1325</a>)
(<a
href="0ea6d8f2c7">0ea6d8f</a>)</li>
</ul>
<h2>v2.47.1</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.0...v2.47.1">2.47.1</a>
(2024-12-05)</h2>
<h3>Reverts</h3>
<ul>
<li>Revert &quot;feat: Realtime using accessToken callback for auth (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1320">#1320</a>)&quot;
(<a
href="https://redirect.github.com/supabase/supabase-js/issues/1323">#1323</a>)
(<a
href="b9c86f503e">b9c86f5</a>),
closes <a
href="https://redirect.github.com/supabase/supabase-js/issues/1320">#1320</a>
<a
href="https://redirect.github.com/supabase/supabase-js/issues/1323">#1323</a></li>
</ul>
<h2>v2.47.0</h2>
<h1><a
href="https://github.com/supabase/supabase-js/compare/v2.46.2...v2.47.0">2.47.0</a>
(2024-12-05)</h1>
<h3>Features</h3>
<ul>
<li>Realtime using accessToken callback for auth (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1320">#1320</a>)
(<a
href="88a44dfbfd">88a44df</a>)</li>
</ul>
<h2>v2.46.2</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.46.1...v2.46.2">2.46.2</a>
(2024-11-27)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>bump up realtime-js (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1318">#1318</a>)
(<a
href="456f27e02e">456f27e</a>)</li>
</ul>
<h2>v2.46.2-rc.3</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.46.2-rc.2...v2.46.2-rc.3">2.46.2-rc.3</a>
(2024-11-13)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>cut release (<a
href="917cbf717c">917cbf7</a>)</li>
</ul>
<h2>v2.46.2-rc.2</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.46.2-rc.1...v2.46.2-rc.2">2.46.2-rc.2</a>
(2024-11-13)</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="325c2c9b25"><code>325c2c9</code></a>
fix: Bind proper object to setAuth on Realtime callback (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1324">#1324</a>)</li>
<li><a
href="0ea6d8f2c7"><code>0ea6d8f</code></a>
fix: bump auth-js to v2.66.1 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1325">#1325</a>)</li>
<li><a
href="b9c86f503e"><code>b9c86f5</code></a>
Revert &quot;feat: Realtime using accessToken callback for auth (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1320">#1320</a>)&quot;
(<a
href="https://redirect.github.com/supabase/supabase-js/issues/1323">#1323</a>)</li>
<li><a
href="88a44dfbfd"><code>88a44df</code></a>
feat: Realtime using accessToken callback for auth (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1320">#1320</a>)</li>
<li><a
href="456f27e02e"><code>456f27e</code></a>
fix: bump up realtime-js (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1318">#1318</a>)</li>
<li>See full diff in <a
href="https://github.com/supabase/supabase-js/compare/v2.46.1...v2.47.3">compare
view</a></li>
</ul>
</details>
<br />
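The `@supabase/supabase-js` 2.47.x entries above revolve around Realtime auth: the `accessToken` callback added in 2.47.0 and reverted in 2.47.1, plus the `setAuth` binding fix in 2.47.3. For context, a hedged sketch of the Realtime subscription path those changes touch; the URL, key, and channel name are placeholders:

```ts
import { createClient } from "@supabase/supabase-js";

// Placeholders only; no 2.47-specific auth option is shown because the
// accessToken-callback feature was added and then reverted in this range.
const supabase = createClient("https://your-project.supabase.co", "public-anon-key");

const channel = supabase
  .channel("agent-runs")
  .on("broadcast", { event: "status" }, (payload) => {
    console.log("status update", payload);
  })
  .subscribe();

// Clean up when done:
// await supabase.removeChannel(channel);
```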

Updates `@xyflow/react` from 12.3.5 to 12.3.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/releases"><code>@​xyflow/react</code>'s
releases</a>.</em></p>
<blockquote>
<h2><code>@xyflow/react</code>@12.3.6</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4846">#4846</a> <a
href="7501793900"><code>75017939</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Make it possible to use expandParent with immer and other immutable
helpers</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4865">#4865</a> <a
href="2c4acc2bd9"><code>2c4acc2b</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add group node to BuiltInNode type. Thanks <a
href="https://github.com/sjdemartini"><code>@​sjdemartini</code></a>!</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4877">#4877</a> <a
href="9a8309dab8"><code>9a8309da</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
intersections for nodes with origins other than [0,0]. Thanks <a
href="https://github.com/gmvrpw"><code>@​gmvrpw</code></a>!</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4844">#4844</a> <a
href="6f11e552c3"><code>6f11e552</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Allow custom data-testid for ReactFlow component</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4816">#4816</a> <a
href="43aa52a8cd"><code>43aa52a8</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Type isValidConnection prop correctly by passing EdgeType</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4855">#4855</a> <a
href="106c2cf8e5"><code>106c2cf8</code></a>
Thanks <a
href="https://github.com/mhuggins"><code>@​mhuggins</code></a>! -
Support passing <code>path</code> element attributes to
<code>BaseEdge</code> component.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4862">#4862</a> <a
href="adf4fb4e7b"><code>adf4fb4e</code></a>
Thanks <a
href="https://github.com/bcakmakoglu"><code>@​bcakmakoglu</code></a>! -
Prevent default scrolling behavior when nodes or a selection is moved
with an arrow key press.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4875">#4875</a> <a
href="41d4743a69"><code>41d4743a</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! -
Prevent unnecessary rerenders of edges when resizing the flow.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4826">#4826</a> <a
href="5f90acdab1"><code>5f90acda</code></a>
Thanks <a href="https://github.com/chrtze"><code>@​chrtze</code></a>! -
Forward ref of the div inside Panel components.</p>
</li>
<li>
<p>Updated dependencies [<a
href="d60331e6ba"><code>d60331e6</code></a>]:</p>
<ul>
<li><code>@xyflow/system</code>@0.0.47</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/blob/main/packages/react/CHANGELOG.md"><code>@​xyflow/react</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>12.3.6</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4846">#4846</a> <a
href="7501793900"><code>75017939</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Make it possible to use expandParent with immer and other immutable
helpers</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4865">#4865</a> <a
href="2c4acc2bd9"><code>2c4acc2b</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add group node to BuiltInNode type. Thanks <a
href="https://github.com/sjdemartini"><code>@​sjdemartini</code></a>!</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4877">#4877</a> <a
href="9a8309dab8"><code>9a8309da</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
intersections for nodes with origins other than [0,0]. Thanks <a
href="https://github.com/gmvrpw"><code>@​gmvrpw</code></a>!</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4844">#4844</a> <a
href="6f11e552c3"><code>6f11e552</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Allow custom data-testid for ReactFlow component</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4816">#4816</a> <a
href="43aa52a8cd"><code>43aa52a8</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Type isValidConnection prop correctly by passing EdgeType</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4855">#4855</a> <a
href="106c2cf8e5"><code>106c2cf8</code></a>
Thanks <a
href="https://github.com/mhuggins"><code>@​mhuggins</code></a>! -
Support passing <code>path</code> element attributes to
<code>BaseEdge</code> component.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4862">#4862</a> <a
href="adf4fb4e7b"><code>adf4fb4e</code></a>
Thanks <a
href="https://github.com/bcakmakoglu"><code>@​bcakmakoglu</code></a>! -
Prevent default scrolling behavior when nodes or a selection is moved
with an arrow key press.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4875">#4875</a> <a
href="41d4743a69"><code>41d4743a</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! -
Prevent unnecessary rerenders of edges when resizing the flow.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4826">#4826</a> <a
href="5f90acdab1"><code>5f90acda</code></a>
Thanks <a href="https://github.com/chrtze"><code>@​chrtze</code></a>! -
Forward ref of the div inside Panel components.</p>
</li>
<li>
<p>Updated dependencies [<a
href="d60331e6ba"><code>d60331e6</code></a>]:</p>
<ul>
<li><code>@xyflow/system</code>@0.0.47</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ba75c01da4"><code>ba75c01</code></a>
chore(packages): bump</li>
<li><a
href="42bb083a6c"><code>42bb083</code></a>
Merge pull request <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/4877">#4877</a>
from xyflow/fix/intersection-origin2</li>
<li><a
href="7de640685a"><code>7de6406</code></a>
Fix origin not being respected when calculating rect
<li><a
href="738510b709"><code>738510b</code></a>
Prevent rerendering of EdgeRenderer by removing width &amp; height from
selector</li>
<li><a
href="a666888a1e"><code>a666888</code></a>
refactor(types): add group node to BuiltInNode type
<li><a
href="ce8c2cf7a6"><code>ce8c2cf</code></a>
Merge pull request <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/4855">#4855</a>
from mhuggins/html-element-props</li>
<li><a
href="7a8e53463e"><code>7a8e534</code></a>
chore(react): cleanup base edge</li>
<li><a
href="cc6b1d9717"><code>cc6b1d9</code></a>
remove unused import
<li><a
href="cf3bae57b1"><code>cf3bae5</code></a>
removed svg props from straight edge</li>
<li><a
href="3526963f0b"><code>3526963</code></a>
Revert back to passing props manually</li>
<li>Additional commits viewable in <a
href="https://github.com/xyflow/xyflow/commits/@xyflow/react@12.3.6/packages/react">compare
view</a></li>
</ul>
</details>
<br />
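Two of the `@xyflow/react` 12.3.6 patches above are visible directly at the call site: a custom `data-testid` on `<ReactFlow>` (#4844) and SVG path attributes passed through to `<BaseEdge>` (#4855). A hedged TSX sketch, with component and test-id names chosen only for illustration:

```tsx
import { ReactFlow, BaseEdge, getStraightPath, type EdgeProps } from "@xyflow/react";

// Pass-through of path attributes such as strokeDasharray is assumed from the
// #4855 changelog entry above.
function DashedEdge({ sourceX, sourceY, targetX, targetY }: EdgeProps) {
  const [path] = getStraightPath({ sourceX, sourceY, targetX, targetY });
  return <BaseEdge path={path} strokeDasharray="4 2" />;
}

const edgeTypes = { dashed: DashedEdge };

export function AgentFlow() {
  return (
    <ReactFlow
      data-testid="agent-flow" // custom test id per #4844
      nodes={[]}
      edges={[]}
      edgeTypes={edgeTypes}
    />
  );
}
```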

Updates `class-variance-authority` from 0.7.0 to 0.7.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/joe-bell/cva/releases">class-variance-authority's
releases</a>.</em></p>
<blockquote>
<h2>v0.7.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Add LICENSE Comments by <a
href="https://github.com/joe-bell"><code>@​joe-bell</code></a> in <a
href="https://redirect.github.com/joe-bell/cva/pull/283">joe-bell/cva#283</a></li>
<li>chore: move clsx dependency to caret/semver range by <a
href="https://github.com/philwolstenholme"><code>@​philwolstenholme</code></a>
in <a
href="https://redirect.github.com/joe-bell/cva/pull/316">joe-bell/cva#316</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/philwolstenholme"><code>@​philwolstenholme</code></a>
made their first contribution in <a
href="https://redirect.github.com/joe-bell/cva/pull/316">joe-bell/cva#316</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/joe-bell/cva/compare/v0.7.0...v0.7.1">https://github.com/joe-bell/cva/compare/v0.7.0...v0.7.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="45462dd239"><code>45462dd</code></a>
class-variance-authority@0.7.1</li>
<li><a
href="c236552742"><code>c236552</code></a>
docs: change x.com references to bluesky</li>
<li><a
href="985dba91cf"><code>985dba9</code></a>
chore: move clsx dependency to caret/semver range (<a
href="https://redirect.github.com/joe-bell/cva/issues/316">#316</a>)</li>
<li><a
href="d4ded2dfcc"><code>d4ded2d</code></a>
chore: update sponsors.svg [ci skip]</li>
<li><a
href="ff1717cbe3"><code>ff1717c</code></a>
ci(schedule): adjust cron date to offset midnight traffic</li>
<li><a
href="2f96730b7b"><code>2f96730</code></a>
ci: prevent scheduled workflow running in forks</li>
<li><a
href="aaae670a35"><code>aaae670</code></a>
docs(beta): bun installation</li>
<li><a
href="69feb436b6"><code>69feb43</code></a>
update docs for bun installation (<a
href="https://redirect.github.com/joe-bell/cva/issues/261">#261</a>)</li>
<li><a
href="f9e2ea6764"><code>f9e2ea6</code></a>
chore(docs): update banner links</li>
<li><a
href="5228f0e66f"><code>5228f0e</code></a>
chore: link sponsors to raw svg</li>
<li>Additional commits viewable in <a
href="https://github.com/joe-bell/cva/compare/v0.7.0...v0.7.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `dotenv` from 16.4.5 to 16.4.7
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/motdotla/dotenv/blob/master/CHANGELOG.md">dotenv's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/motdotla/dotenv/compare/v16.4.6...v16.4.7">16.4.7</a>
(2024-12-03)</h2>
<h3>Changed</h3>
<ul>
<li>Ignore <code>.tap</code> folder when publishing. (oops, sorry about
that everyone. - <a
href="https://github.com/motdotla"><code>@​motdotla</code></a>) <a
href="https://redirect.github.com/motdotla/dotenv/pull/848">#848</a></li>
</ul>
<h2><a
href="https://github.com/motdotla/dotenv/compare/v16.4.5...v16.4.6">16.4.6</a>
(2024-12-02)</h2>
<h3>Changed</h3>
<ul>
<li>Clean up stale dev dependencies <a
href="https://redirect.github.com/motdotla/dotenv/pull/847">#847</a></li>
<li>Various README updates clarifying usage and alternative solutions
using <a href="https://github.com/dotenvx/dotenvx">dotenvx</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a338d68264"><code>a338d68</code></a>
16.4.7</li>
<li><a
href="daf3e3d5cc"><code>daf3e3d</code></a>
changelog 🪵</li>
<li><a
href="fb74f6809f"><code>fb74f68</code></a>
Merge pull request <a
href="https://redirect.github.com/motdotla/dotenv/issues/848">#848</a>
from Spice-King/patch-1</li>
<li><a
href="fe87ba23b5"><code>fe87ba2</code></a>
Add .tap to .npmignore</li>
<li><a
href="0c9f764c66"><code>0c9f764</code></a>
16.4.6</li>
<li><a
href="fd5f26b6c7"><code>fd5f26b</code></a>
changelog 🪵</li>
<li><a
href="bb19b6bb55"><code>bb19b6b</code></a>
Merge pull request <a
href="https://redirect.github.com/motdotla/dotenv/issues/847">#847</a>
from motdotla/deps-updates</li>
<li><a
href="2f4e36bbe2"><code>2f4e36b</code></a>
further dev dependency cleanup</li>
<li><a
href="c2fdd0169d"><code>c2fdd01</code></a>
send to codecov</li>
<li><a
href="6707487b9e"><code>6707487</code></a>
add test coverage</li>
<li>Additional commits viewable in <a
href="https://github.com/motdotla/dotenv/compare/v16.4.5...v16.4.7">compare
view</a></li>
</ul>
</details>
<br />

Updates `lucide-react` from 0.462.0 to 0.468.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/lucide-icons/lucide/releases">lucide-react's
releases</a>.</em></p>
<blockquote>
<h2>New icons 0.468.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>waves-ladder</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2529">#2529</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>New icons 0.467.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>scan-heart</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2385">#2385</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>book-dashed</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2399">#2399</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>New icons 0.466.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>list-filter-plus</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2645">#2645</a>)
by <a href="https://github.com/abdeniz"><code>@​abdeniz</code></a></li>
</ul>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>bell-dot</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2656">#2656</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
<li><code>bell-minus</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2656">#2656</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
<li><code>bell-off</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2656">#2656</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
<li><code>bell-plus</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2656">#2656</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
<li><code>bell-ring</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2656">#2656</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
<li><code>bell</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2656">#2656</a>)
by <a
href="https://github.com/karsa-mistmere"><code>@​karsa-mistmere</code></a></li>
</ul>
<h2>New icons 0.465.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>droplet-off</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2641">#2641</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>flask-conical-off</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2659">#2659</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
<li><code>flask-conical</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2659">#2659</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
<li><code>flask-round</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2659">#2659</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
</ul>
<h2>New icons 0.464.0</h2>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>paperclip</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2482">#2482</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
<li><code>picture-in-picture</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2481">#2481</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4f038d5fe8"><code>4f038d5</code></a>
feat(docs): add Bun.sh support to documentation (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2642">#2642</a>)</li>
<li>See full diff in <a
href="https://github.com/lucide-icons/lucide/commits/0.468.0/packages/lucide-react">compare
view</a></li>
</ul>
</details>
<br />

Updates `next-themes` from 0.4.3 to 0.4.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pacocoursey/next-themes/releases">next-themes's
releases</a>.</em></p>
<blockquote>
<h2>v0.4.4</h2>
<h2>What's Changed</h2>
<ul>
<li>fix: infinite loop theme flicker by <a
href="https://github.com/arturbien"><code>@​arturbien</code></a> in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/329">pacocoursey/next-themes#329</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/o1Suleyman"><code>@​o1Suleyman</code></a> made
their first contribution in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/328">pacocoursey/next-themes#328</a></li>
<li><a href="https://github.com/arturbien"><code>@​arturbien</code></a>
made their first contribution in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/329">pacocoursey/next-themes#329</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pacocoursey/next-themes/compare/v0.4.3...v0.4.4">https://github.com/pacocoursey/next-themes/compare/v0.4.3...v0.4.4</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="57c0561b1f"><code>57c0561</code></a>
v0.4.4</li>
<li><a
href="ae2ab9b47a"><code>ae2ab9b</code></a>
fix: infinite loop theme flicker (<a
href="https://redirect.github.com/pacocoursey/next-themes/issues/329">#329</a>)</li>
<li><a
href="32ef714130"><code>32ef714</code></a>
Fix &quot;With Tailwind&quot; link (<a
href="https://redirect.github.com/pacocoursey/next-themes/issues/328">#328</a>)</li>
<li>See full diff in <a
href="https://github.com/pacocoursey/next-themes/compare/v0.4.3...v0.4.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-day-picker` from 9.4.0 to 9.4.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gpbl/react-day-picker/releases">react-day-picker's
releases</a>.</em></p>
<blockquote>
<h2>v9.4.2</h2>
<p>This release addresses some bugs in the dropdown caption layout.</p>
<h2>What's Changed</h2>
<ul>
<li>fix: display all available years in the dropdown by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2614">gpbl/react-day-picker#2614</a></li>
<li>fix: display all months in dropdown by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2619">gpbl/react-day-picker#2619</a></li>
<li>docs: update styling.mdx by <a
href="https://github.com/AlecRust"><code>@​AlecRust</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2611">gpbl/react-day-picker#2611</a></li>
<li>docs: code typo in input-fields.mdx by <a
href="https://github.com/pkgacek"><code>@​pkgacek</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2613">gpbl/react-day-picker#2613</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/AlecRust"><code>@​AlecRust</code></a>
made their first contribution in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2611">gpbl/react-day-picker#2611</a></li>
<li><a href="https://github.com/pkgacek"><code>@​pkgacek</code></a> made
their first contribution in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2613">gpbl/react-day-picker#2613</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gpbl/react-day-picker/compare/v9.4.1...v9.4.2">https://github.com/gpbl/react-day-picker/compare/v9.4.1...v9.4.2</a></p>
<h2>v9.4.1</h2>
<p>This release improves support for screen readers and fixes a
VoiceOver issue when navigating the calendar.</p>
<h2>What's Changed</h2>
<ul>
<li>fix(a11y): improve screen reader and VoiceOver support by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2609">gpbl/react-day-picker#2609</a></li>
<li>feat(a11y): added <code>role</code> and <code>aria-label</code>
props by <a href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2609">gpbl/react-day-picker#2609</a></li>
<li>chore(style): remove unused CSS variable by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2610">gpbl/react-day-picker#2610</a></li>
<li>chore: use callbacks for dropdown event handlers by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2602">gpbl/react-day-picker#2602</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gpbl/react-day-picker/compare/v9.4.0...v9.4.1">https://github.com/gpbl/react-day-picker/compare/v9.4.0...v9.4.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a617141132"><code>a617141</code></a>
build: bump v9.4.2</li>
<li><a
href="63303e772a"><code>63303e7</code></a>
fix: dropdown may miss some disabled months (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2619">#2619</a>)</li>
<li><a
href="3db609c02f"><code>3db609c</code></a>
chore(test): update variable name</li>
<li><a
href="203fc22b1f"><code>203fc22</code></a>
fix: enable all years available in the year select dropdown (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2614">#2614</a>)</li>
<li><a
href="90ed771859"><code>90ed771</code></a>
website: update border-radius for admonition</li>
<li><a
href="1154e3b3b1"><code>1154e3b</code></a>
website: update styles</li>
<li><a
href="62bbc85796"><code>62bbc85</code></a>
website: remove draft documents</li>
<li><a
href="fd6975daa2"><code>fd6975d</code></a>
docs: update input-fields.mdx code sample (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2613">#2613</a>)</li>
<li><a
href="661c585ce3"><code>661c585</code></a>
docs: update styling.mdx (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2611">#2611</a>)</li>
<li><a
href="35a2824c22"><code>35a2824</code></a>
chore(style): remove unused CSS variable (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2610">#2610</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/gpbl/react-day-picker/compare/v9.4.0...v9.4.2">compare
view</a></li>
</ul>
</details>
<br />
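The `react-day-picker` 9.4.1/9.4.2 notes above concern the dropdown caption layout and accessibility rather than new API surface. A hedged sketch of the layout those fixes target; the `startMonth`/`endMonth` range is illustrative and the v9 prop names are assumed rather than taken from the quoted notes:

```tsx
import { DayPicker } from "react-day-picker";
import "react-day-picker/style.css";

export function BirthdayPicker() {
  return (
    <DayPicker
      mode="single"
      captionLayout="dropdown" // the month/year dropdowns fixed in 9.4.2
      startMonth={new Date(2000, 0)}
      endMonth={new Date(2030, 11)}
    />
  );
}
```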

Updates `react-hook-form` from 7.53.2 to 7.54.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/react-hook-form/react-hook-form/releases">react-hook-form's
releases</a>.</em></p>
<blockquote>
<h2>Version 7.54.0</h2>
<p>🦥 fix: useForm should return a new object on formState changes (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12424">#12424</a>)
🧻 improve prototype pollution check (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12431">#12431</a>)
🪖 fix: add FileList availability check for environments without FileList
support (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12332">#12332</a>)
🧪 close <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12198">#12198</a>
memo for useController and useFormState (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12421">#12421</a>)
🐞 fix <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12407">#12407</a>
useFieldArray append issue with useForm disabled props (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12420">#12420</a>)
🐞 fix <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12415">#12415</a>
issue with flatten object with null value (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12418">#12418</a>)
🐞 fix <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12385">#12385</a>
nested array field invalid validation report on removed (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12405">#12405</a>)
🙀 fix: hasPromiseValidation return true or false appropriately. (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12389">#12389</a>)
👃 fix more staled props (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12404">#12404</a>)</p>
<p>thanks to <a
href="https://github.com/developer-bandi"><code>@​developer-bandi</code></a>,
<a href="https://github.com/OlegDev1"><code>@​OlegDev1</code></a>, <a
href="https://github.com/sukvvon"><code>@​sukvvon</code></a>, <a
href="https://github.com/alexandredev3"><code>@​alexandredev3</code></a>
and <a
href="https://github.com/mfazekas"><code>@​mfazekas</code></a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="893ffcecee"><code>893ffce</code></a>
7.54.0</li>
<li><a
href="9532038f15"><code>9532038</code></a>
❤️ thank you so much for St. Galler Kantonalbank AG sponsor</li>
<li><a
href="5db95c93b7"><code>5db95c9</code></a>
🐸 update SECURITY.md</li>
<li><a
href="09a9a495ec"><code>09a9a49</code></a>
🦥 fix: useForm should return a new object on formState changes (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12424">#12424</a>)</li>
<li><a
href="0952f7e26d"><code>0952f7e</code></a>
🧻 improve prototype pollution check (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12431">#12431</a>)</li>
<li><a
href="30ea87e203"><code>30ea87e</code></a>
🪖 fix: add <code>FileList</code> availability check for environments
without <code>FileList</code> ...</li>
<li><a
href="29ae596609"><code>29ae596</code></a>
🧪 close <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12198">#12198</a>
memo for useController and useFormState (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12421">#12421</a>)</li>
<li><a
href="2d7b78981f"><code>2d7b789</code></a>
🐞 fix <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12407">#12407</a>
useFieldArray append issue with useForm disabled props (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12420">#12420</a>)</li>
<li><a
href="00e39c8a28"><code>00e39c8</code></a>
🐞 fix <a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12415">#12415</a>
issue with flatten object with null value (<a
href="https://redirect.github.com/react-hook-form/react-hook-form/issues/12418">#12418</a>)</li>
<li><a
href="2b1c709815"><code>2b1c709</code></a>
Update README.md</li>
<li>Additional commits viewable in <a
href="https://github.com/react-hook-form/react-hook-form/compare/v7.53.2...v7.54.0">compare
view</a></li>
</ul>
</details>
<br />
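Among the `react-hook-form` 7.54.0 items above, the ones a consumer is most likely to notice are that `formState` changes now surface as a new object (#12424) and that `useFieldArray` append works with a form-level `disabled` flag (#12420). A hedged TSX sketch; the field names and component are illustrative:

```tsx
import { useForm, useFieldArray } from "react-hook-form";

type FormValues = { tags: { value: string }[] };

export function TagsForm({ disabled }: { disabled: boolean }) {
  const { control, register, formState } = useForm<FormValues>({
    defaultValues: { tags: [] },
    disabled, // appending while disabled is the case addressed by #12420
  });
  const { fields, append } = useFieldArray({ control, name: "tags" });

  return (
    <form>
      {fields.map((field, i) => (
        <input key={field.id} {...register(`tags.${i}.value`)} />
      ))}
      <button type="button" onClick={() => append({ value: "" })}>
        Add tag
      </button>
      {/* formState updates are returned as a fresh object (#12424) */}
      <p>{formState.isDirty ? "edited" : "pristine"}</p>
    </form>
  );
}
```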

Updates `react-icons` from 5.3.0 to 5.4.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/react-icons/react-icons/releases">react-icons's
releases</a>.</em></p>
<blockquote>
<h2>v5.4.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Add closing of the icon details modal with the ESC key by <a
href="https://github.com/gabrielogregorio"><code>@​gabrielogregorio</code></a>
in <a
href="https://redirect.github.com/react-icons/react-icons/pull/900">react-icons/react-icons#900</a></li>
<li>support moduleResolution: bundler in tsconfig by <a
href="https://github.com/kamijin-fanta"><code>@​kamijin-fanta</code></a>
in <a
href="https://redirect.github.com/react-icons/react-icons/pull/970">react-icons/react-icons#970</a></li>
<li>min search length changed to 2 by <a
href="https://github.com/Kumar06Lav"><code>@​Kumar06Lav</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/967">react-icons/react-icons#967</a></li>
<li>Bump webpack from 5.89.0 to 5.94.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/975">react-icons/react-icons#975</a></li>
<li>Bump micromatch from 4.0.5 to 4.0.8 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/976">react-icons/react-icons#976</a></li>
<li>Bump axios from 1.6.8 to 1.7.7 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/977">react-icons/react-icons#977</a></li>
<li>preview: Reduce the number of results displayed in search results by
<a
href="https://github.com/kamijin-fanta"><code>@​kamijin-fanta</code></a>
in <a
href="https://redirect.github.com/react-icons/react-icons/pull/997">react-icons/react-icons#997</a></li>
<li>2024-12-03 upgrade icons by <a
href="https://github.com/kamijin-fanta"><code>@​kamijin-fanta</code></a>
in <a
href="https://redirect.github.com/react-icons/react-icons/pull/998">react-icons/react-icons#998</a></li>
<li>Bump dset from 3.1.3 to 3.1.4 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/979">react-icons/react-icons#979</a></li>
<li>Bump express from 4.19.2 to 4.21.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/981">react-icons/react-icons#981</a></li>
<li>Bump rollup from 2.79.1 to 2.79.2 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/982">react-icons/react-icons#982</a></li>
<li>Bump http-proxy-middleware from 2.0.6 to 2.0.7 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/989">react-icons/react-icons#989</a></li>
<li>Bump cross-spawn from 7.0.3 to 7.0.6 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/994">react-icons/react-icons#994</a></li>
<li>workflow: upgrade workflows by <a
href="https://github.com/kamijin-fanta"><code>@​kamijin-fanta</code></a>
in <a
href="https://redirect.github.com/react-icons/react-icons/pull/999">react-icons/react-icons#999</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/Kumar06Lav"><code>@​Kumar06Lav</code></a> made
their first contribution in <a
href="https://redirect.github.com/react-icons/react-icons/pull/967">react-icons/react-icons#967</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/react-icons/react-icons/compare/v5.3.0...v5.4.0">https://github.com/react-icons/react-icons/compare/v5.3.0...v5.4.0</a></p>
<table>
<thead>
<tr>
<th>Icon Library</th>
<th>License</th>
<th>Version</th>
<th align="right">Count</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="https://circumicons.com/">Circum Icons</a></td>
<td><a
href="https://github.com/Klarr-Agency/Circum-Icons/blob/main/LICENSE">MPL-2.0
license</a></td>
<td>1.0.0</td>
<td align="right">288</td>
</tr>
<tr>
<td><a href="https://fontawesome.com/">Font Awesome 5</a></td>
<td><a href="https://creativecommons.org/licenses/by/4.0/">CC BY 4.0
License</a></td>
<td>5.15.4-3-gafecf2a</td>
<td align="right">1612</td>
</tr>
<tr>
<td><a href="https://fontawesome.com/">Font Awesome 6</a></td>
<td><a href="https://creativecommons.org/licenses/by/4.0/">CC BY 4.0
License</a></td>
<td>6.6.0</td>
<td align="right">2050</td>
</tr>
<tr>
<td><a href="https://ionicons.com/">Ionicons 4</a></td>
<td><a
href="https://github.com/ionic-team/ionicons/blob/master/LICENSE">MIT</a></td>
<td>4.6.3</td>
<td align="right">696</td>
</tr>
<tr>
<td><a href="https://ionicons.com/">Ionicons 5</a></td>
<td><a
href="https://github.com/ionic-team/ionicons/blob/master/LICENSE">MIT</a></td>
<td>5.5.4</td>
<td align="right">1332</td>
</tr>
<tr>
<td><a href="http://google.github.io/material-design-icons/">Material
Design icons</a></td>
<td><a
href="https://github.com/google/material-design-icons/blob/master/LICENSE">Apache
License Version 2.0</a></td>
<td>4.0.0-125-gef43291c4d</td>
<td align="right">4341</td>
</tr>
<tr>
<td><a href="http://s-ings.com/typicons/">Typicons</a></td>
<td><a href="https://creativecommons.org/licenses/by-sa/3.0/">CC BY-SA
3.0</a></td>
<td>2.1.2</td>
<td align="right">336</td>
</tr>
<tr>
<td><a href="https://octicons.github.com/">Github Octicons
icons</a></td>
<td><a
href="https://github.com/primer/octicons/blob/master/LICENSE">MIT</a></td>
<td>18.3.0</td>
<td align="right">264</td>
</tr>
<tr>
<td><a href="https://feathericons.com/">Feather</a></td>
<td><a
href="https://github.com/feathericons/feather/blob/master/LICENSE">MIT</a></td>
<td>4.29.2</td>
<td align="right">287<...

_Description has been truncated_

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-09 20:49:31 +00:00
dependabot[bot] df431d71ff
build(deps): bump the production-dependencies group in /autogpt_platform/market with 2 updates (#8922)
Bumps the production-dependencies group in /autogpt_platform/market with
2 updates: [fastapi](https://github.com/fastapi/fastapi) and
[sentry-sdk](https://github.com/getsentry/sentry-python).

Updates `fastapi` from 0.115.5 to 0.115.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/fastapi/fastapi/releases">fastapi's
releases</a>.</em></p>
<blockquote>
<h2>0.115.6</h2>
<h3>Fixes</h3>
<ul>
<li>🐛 Preserve traceback when an exception is raised in sync dependency
with <code>yield</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/5823">#5823</a>
by <a href="https://github.com/sombek"><code>@​sombek</code></a>.</li>
</ul>
<h3>Refactors</h3>
<ul>
<li>♻️ Update tests and internals for compatibility with Pydantic
&gt;=2.10. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12971">#12971</a>
by <a href="https://github.com/tamird"><code>@​tamird</code></a>.</li>
</ul>
<h3>Docs</h3>
<ul>
<li>📝 Update includes format in docs with an automated script. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12950">#12950</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>📝 Update includes for
<code>docs/de/docs/advanced/using-request-directly.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12685">#12685</a>
by <a
href="https://github.com/alissadb"><code>@​alissadb</code></a>.</li>
<li>📝 Update includes for
<code>docs/de/docs/how-to/conditional-openapi.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12689">#12689</a>
by <a
href="https://github.com/alissadb"><code>@​alissadb</code></a>.</li>
</ul>
<h3>Translations</h3>
<ul>
<li>🌐 Add Traditional Chinese translation for
<code>docs/zh-hant/docs/async.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12990">#12990</a>
by <a
href="https://github.com/ILoveSorasakiHina"><code>@​ILoveSorasakiHina</code></a>.</li>
<li>🌐 Add Traditional Chinese translation for
<code>docs/zh-hant/docs/tutorial/query-param-models.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12932">#12932</a>
by <a
href="https://github.com/Vincy1230"><code>@​Vincy1230</code></a>.</li>
<li>🌐 Add Korean translation for
<code>docs/ko/docs/advanced/testing-dependencies.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12992">#12992</a>
by <a
href="https://github.com/Limsunoh"><code>@​Limsunoh</code></a>.</li>
<li>🌐 Add Korean translation for
<code>docs/ko/docs/advanced/websockets.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12991">#12991</a>
by <a
href="https://github.com/kwang1215"><code>@​kwang1215</code></a>.</li>
<li>🌐 Add Portuguese translation for
<code>docs/pt/docs/tutorial/response-model.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12933">#12933</a>
by <a
href="https://github.com/AndreBBM"><code>@​AndreBBM</code></a>.</li>
<li>🌐 Add Korean translation for
<code>docs/ko/docs/advanced/middlewares.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12753">#12753</a>
by <a
href="https://github.com/nahyunkeem"><code>@​nahyunkeem</code></a>.</li>
<li>🌐 Add Korean translation for
<code>docs/ko/docs/advanced/openapi-webhooks.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12752">#12752</a>
by <a href="https://github.com/saeye"><code>@​saeye</code></a>.</li>
<li>🌐 Add Chinese translation for
<code>docs/zh/docs/tutorial/query-param-models.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12931">#12931</a>
by <a
href="https://github.com/Vincy1230"><code>@​Vincy1230</code></a>.</li>
<li>🌐 Add Russian translation for
<code>docs/ru/docs/tutorial/query-param-models.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12445">#12445</a>
by <a
href="https://github.com/gitgernit"><code>@​gitgernit</code></a>.</li>
<li>🌐 Add Korean translation for
<code>docs/ko/docs/tutorial/query-param-models.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12940">#12940</a>
by <a href="https://github.com/jts8257"><code>@​jts8257</code></a>.</li>
<li>🔥 Remove obsolete tutorial translation to Chinese for
<code>docs/zh/docs/tutorial/sql-databases.md</code>, it references files
that are no longer on the repo. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12949">#12949</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Internal</h3>
<ul>
<li>⬆ [pre-commit.ci] pre-commit autoupdate. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/12954">#12954</a>
by <a
href="https://github.com/apps/pre-commit-ci"><code>@​pre-commit-ci[bot]</code></a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bb8c2a6498"><code>bb8c2a6</code></a>
🔖 Release version 0.115.6</li>
<li><a
href="905ec1edde"><code>905ec1e</code></a>
📝 Update release notes</li>
<li><a
href="4f8157588e"><code>4f81575</code></a>
🐛 Preserve traceback when exception is raised in sync dependency with
<code>yield</code>...</li>
<li><a
href="8255edfecf"><code>8255edf</code></a>
📝 Update release notes</li>
<li><a
href="53c87842b0"><code>53c8784</code></a>
🌐 Add Traditional Chinese translation for
<code>docs/zh-hant/docs/async.md</code> (<a
href="https://redirect.github.com/fastapi/fastapi/issues/12990">#12990</a>)</li>
<li><a
href="297135244d"><code>2971352</code></a>
📝 Update release notes</li>
<li><a
href="8376228a49"><code>8376228</code></a>
🌐 Add Traditional Chinese translation for
`docs/zh-hant/docs/tutorial/query-p...</li>
<li><a
href="6c7873c77e"><code>6c7873c</code></a>
📝 Update release notes</li>
<li><a
href="d75b81ce3f"><code>d75b81c</code></a>
🌐 Add Korean translation for
<code>docs/ko/docs/advanced/testing-dependencies.md</code> ...</li>
<li><a
href="206037c292"><code>206037c</code></a>
📝 Update release notes</li>
<li>Additional commits viewable in <a
href="https://github.com/fastapi/fastapi/compare/0.115.5...0.115.6">compare
view</a></li>
</ul>
</details>
<br />

Updates `sentry-sdk` from 2.19.0 to 2.19.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/releases">sentry-sdk's
releases</a>.</em></p>
<blockquote>
<h2>2.19.2</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Deepcopy and ensure get_all function always terminates (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3861">#3861</a>)
by <a
href="https://github.com/cmanallen"><code>@​cmanallen</code></a></li>
<li>Cleanup chalice test environment (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3858">#3858</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
</ul>
<h2>2.19.1</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Fix errors when instrumenting Django cache (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3855">#3855</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>Copy <code>scope.client</code> reference as well (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3857">#3857</a>)
by <a
href="https://github.com/sl0thentr0py"><code>@​sl0thentr0py</code></a></li>
<li>Don't give up on Spotlight on 3 errors (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3856">#3856</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>Add missing stack frames (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3673">#3673</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>Fix wrong metadata type in async gRPC interceptor (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3205">#3205</a>)
by <a
href="https://github.com/fdellekart"><code>@​fdellekart</code></a></li>
<li>Rename launch darkly hook to match JS SDK (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3743">#3743</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></li>
<li>Script for checking if our instrumented libs are Python 3.13
compatible (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3425">#3425</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>Improve Ray tests (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3846">#3846</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>Test with Celery <code>5.5.0rc3</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3842">#3842</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Fix asyncio testing setup (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3832">#3832</a>)
by <a
href="https://github.com/sl0thentr0py"><code>@​sl0thentr0py</code></a></li>
<li>Bump <code>codecov/codecov-action</code> from <code>5.0.2</code> to
<code>5.0.7</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3821">#3821</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Fix CI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3834">#3834</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Use new ClickHouse GH action (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3826">#3826</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md">sentry-sdk's
changelog</a>.</em></p>
<blockquote>
<h2>2.19.2</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Deepcopy and ensure get_all function always terminates (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3861">#3861</a>)
by <a
href="https://github.com/cmanallen"><code>@​cmanallen</code></a></li>
<li>Cleanup chalice test environment (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3858">#3858</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
</ul>
<h2>2.19.1</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>Fix errors when instrumenting Django cache (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3855">#3855</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>Copy <code>scope.client</code> reference as well (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3857">#3857</a>)
by <a
href="https://github.com/sl0thentr0py"><code>@​sl0thentr0py</code></a></li>
<li>Don't give up on Spotlight on 3 errors (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3856">#3856</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>Add missing stack frames (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3673">#3673</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>Fix wrong metadata type in async gRPC interceptor (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3205">#3205</a>)
by <a
href="https://github.com/fdellekart"><code>@​fdellekart</code></a></li>
<li>Rename launch darkly hook to match JS SDK (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3743">#3743</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></li>
<li>Script for checking if our instrumented libs are Python 3.13
compatible (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3425">#3425</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>Improve Ray tests (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3846">#3846</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>Test with Celery <code>5.5.0rc3</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3842">#3842</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Fix asyncio testing setup (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3832">#3832</a>)
by <a
href="https://github.com/sl0thentr0py"><code>@​sl0thentr0py</code></a></li>
<li>Bump <code>codecov/codecov-action</code> from <code>5.0.2</code> to
<code>5.0.7</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3821">#3821</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>Fix CI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3834">#3834</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>Use new ClickHouse GH action (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3826">#3826</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="163762f107"><code>163762f</code></a>
release: 2.19.2</li>
<li><a
href="8f9461e1a0"><code>8f9461e</code></a>
Deepcopy and ensure get_all function always terminates (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3861">#3861</a>)</li>
<li><a
href="fd56608d46"><code>fd56608</code></a>
Merge branch 'release/2.19.1'</li>
<li><a
href="7ab7fe6749"><code>7ab7fe6</code></a>
Cleanup chalice test environment (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3858">#3858</a>)</li>
<li><a
href="231a6a1d5e"><code>231a6a1</code></a>
Update CHANGELOG.md</li>
<li><a
href="c591b64d50"><code>c591b64</code></a>
release: 2.19.1</li>
<li><a
href="7a6d460bd1"><code>7a6d460</code></a>
Copy scope.client reference as well (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3857">#3857</a>)</li>
<li><a
href="5a09770541"><code>5a09770</code></a>
fix(spotlight): Don't give up on Spotlight on 3 errors (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3856">#3856</a>)</li>
<li><a
href="31fdcfaee7"><code>31fdcfa</code></a>
fix(django): Fix errors when instrumenting Django cache (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3855">#3855</a>)</li>
<li><a
href="5891717b14"><code>5891717</code></a>
Script for checking if our instrumented libs are python 3.13 compatible
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3425">#3425</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-python/compare/2.19.0...2.19.2">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-09 20:39:06 +00:00
vishesh10 281cd2910b
fix(backend): Fix Github PR blocks (#8908)
<!-- Clearly explain the need for these changes: -->

### Background

The GitHub PR blocks do not function properly because the correct API
endpoint is not being called.
- Resolves #8667

### Changes 🏗️
* Added logic to derive the correct API endpoint from the given PR URL
(see the sketch below).
* This logic is implemented in several blocks:
`GithubReadPullRequestBlock`, `GithubAssignPRReviewerBlock`,
`GithubUnassignPRReviewerBlock`, and `GithubListPRReviewersBlock`.
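To illustrate the endpoint derivation, here is a minimal Python sketch.
The helper name and exact mapping are assumptions for illustration, not
the block code itself; it only relies on GitHub's public REST routes for
pull requests.

```
import re

def prepare_pr_api_url(pr_url: str, path: str = "") -> str:
    """Map a GitHub PR web URL to its REST API endpoint (hypothetical helper).

    e.g. https://github.com/owner/repo/pull/123
         -> https://api.github.com/repos/owner/repo/pulls/123[/<path>]
    """
    match = re.match(r"https://github\.com/([^/]+)/([^/]+)/pull/(\d+)", pr_url)
    if not match:
        raise ValueError(f"Not a valid GitHub PR URL: {pr_url}")
    owner, repo, number = match.groups()
    base = f"https://api.github.com/repos/{owner}/{repo}/pulls/{number}"
    return f"{base}/{path}" if path else base

# e.g. a reviewers block would call:
# prepare_pr_api_url(pr_url, "requested_reviewers")
```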

### Test
* Github List PR Reviewers
<img width="511" alt="Screenshot 2024-12-03 at 11 03 59 PM"
src="https://github.com/user-attachments/assets/9c69edcf-c2f4-42d2-954d-0fc4d903ae22">

* Github Read Pull Request (Include Pr Changes checked)
<img width="417" alt="Screenshot 2024-12-06 at 12 01 41 PM"
src="https://github.com/user-attachments/assets/986fada7-7fbb-41b6-a42a-47d1e11fa562">

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-12-09 14:37:02 +00:00
Abhimanyu Yadav 6997e2a170
fix(frontend): Increase size of connection pin in blocks (#8920)
- Resolves https://github.com/Significant-Gravitas/AutoGPT/issues/8913

Increases the size of the connection pin

<img width="1154" alt="Screenshot 2024-12-09 at 6 44 49 PM"
src="https://github.com/user-attachments/assets/7cd1ad0d-94c3-4027-aeea-d5ecd27e498d">
2024-12-09 13:50:32 +00:00
Toran Bruce Richards 1a85eb1dcf
feat(blocks): Add new openrouter models (#8905)
**Summary:**
This PR removes the `GEMINI_FLASH_1_5_EXP` model (due to inference on
OpenRouter not working) and introduces several new models to the
`LlmModel` enum. Corresponding updates have been made to the metadata
configurations and block cost settings to reflect the changes.

**Key Changes:**
1. **Removed Models:**
   - `GEMINI_FLASH_1_5_EXP`

2. **Added New Models:**
   - `QWEN_QWQ_32B_PREVIEW`
   - `NOUSRESEARCH_HERMES_3_LLAMA_3_1_405B`
   - `NOUSRESEARCH_HERMES_3_LLAMA_3_1_70B`
   - `AMAZON_NOVA_LITE_V1`
   - `AMAZON_NOVA_MICRO_V1`
   - `AMAZON_NOVA_PRO_V1`
   - `MICROSOFT_WIZARDLM_2_8X22B`
   - `GRYPHE_MYTHOMAX_L2_13B`

3. **Metadata Updates:**
- Added metadata entries for the new models with a maximum output of
4,000 tokens.

4. **Cost Configuration Updates:**
   - Defined block costs for the newly added models (see the sketch below):
     - `QWEN_QWQ_32B_PREVIEW`: 2 credits
     - All other new models: 1 credit
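
As a rough sketch of what the enum and cost wiring might look like (the
identifiers and model-slug strings below are assumptions, not the actual
code):

```
from enum import Enum

class LlmModel(str, Enum):
    # ...existing members elided...
    QWEN_QWQ_32B_PREVIEW = "qwen/qwq-32b-preview"  # slug assumed
    AMAZON_NOVA_LITE_V1 = "amazon/nova-lite-v1"    # slug assumed
    # remaining new models would follow the same pattern

# Metadata keyed by model, each with a 4,000 max-output-token budget.
MODEL_METADATA = {
    LlmModel.QWEN_QWQ_32B_PREVIEW: {"max_output_tokens": 4000},
    LlmModel.AMAZON_NOVA_LITE_V1: {"max_output_tokens": 4000},
}

# Block cost in credits per call, per the list above.
MODEL_COST = {
    LlmModel.QWEN_QWQ_32B_PREVIEW: 2,
    LlmModel.AMAZON_NOVA_LITE_V1: 1,
}
```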

**Testing:**
- Verified that all models can be called without errors with the AI Text
generator block
2024-12-09 11:49:15 +00:00
Nicholas Tindle b62f411518
feat(frontend): monitor tests (#8880)
<!-- Clearly explain the need for these changes: -->
We want to be able to test the monitor page, including importing and
exporting agents.
### Changes 🏗️

- Adds more test ids
- Builds out monitor.page.ts
- Adds import/export tests
- Fixes #8791, fixes #8795, fixes #8792
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
Writing/Running the automated tests
2024-12-06 21:14:33 +00:00
Kaitlyn Barnard eb79c04855
Incremental additions to platform documentation (#8898)
### Changes 🏗️

Adding incremental documentation based on YouTube series: 
- How to Submit an Agent to the AutoGPT Marketplace
- How to Download and Import an Agent from the AutoGPT Marketplace
(Local Hosting)
- Creating a Basic AI Agent with AutoGPT
- How to Edit an Agent in AutoGPT
- How to Delete an Agent in AutoGPT

---------

Co-authored-by: Bently <tomnoon9@gmail.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-06 19:14:15 +00:00
Aarushi d7c9742d7e
feat(frontend/feature-flags): Add LaunchDarkly feature flagging UI (#8847)
This PR allows us to feature-flag on the frontend, which means we can
roll out features in stages, hide features, do A/B testing, etc.

### Changes 🏗️

Added a LaunchDarkly Provider
Added a withFeatureFlag component
Added two env vars for: 
- enabling LD 
- specifying the _public_ client side key

Usage: 

```
'use client'

import { useFlags } from 'launchdarkly-react-client-sdk'
import { withFeatureFlag } from '@/components/feature-flag/with-feature-flag'

function TestFlagPage() {
  const flags = useFlags()

  return (
    <div className="p-4">
      <h1>If you can see this, the feature flag is ON</h1>
      <pre>Current flag value: {JSON.stringify(flags, null, 2)}</pre>
    </div>
  )
}

export default withFeatureFlag(TestFlagPage, 'test-flag')
```

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Test plan</summary>
- Set LD to false
- Navigate to a test page, should not be visible
- Set LD to true
- Navigate to same test page, should be visible
</details>

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)
- [x] I have updated infra repo

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Bently <tomnoon9@gmail.com>
Co-authored-by: SerchioSD <69461657+serchiosd@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Abhimanyu Yadav <122007096+Abhi1992002@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
2024-12-06 19:11:06 +00:00
Aarushi ea6c9a1152
fix(platform): Stop the start up & shutdown of LaunchDarkly on local envs (#8897)
We aren't using LaunchDarkly locally, so it isn't set up there, but the
app was still attempting to shut LaunchDarkly down when it shut down,
causing errors on shutdown. This PR fixes that issue by disabling LD
entirely on local machines.

### Changes 🏗️

Added a context manager to handle LaunchDarkly startup and shutdown.
Added a check for the local environment in said context manager (see the
sketch below).
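
A minimal sketch of such a context manager, assuming the LaunchDarkly
Python server SDK (`ldclient`) and `APP_ENV` / `LAUNCH_DARKLY_SDK_KEY`
settings; the real environment-variable names and config wiring may
differ.

```
import contextlib
import os

import ldclient
from ldclient.config import Config

@contextlib.contextmanager
def launch_darkly_context():
    """Start LaunchDarkly only outside local environments; always clean up."""
    # Environment check: setting name is an assumption for illustration.
    is_local = os.getenv("APP_ENV", "local") == "local"
    if is_local:
        # Nothing is started locally, so there is nothing to shut down either.
        yield
        return
    ldclient.set_config(Config(os.environ["LAUNCH_DARKLY_SDK_KEY"]))
    try:
        yield
    finally:
        # Close the client only when it was actually started.
        ldclient.get().close()
```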

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-06 12:41:48 +00:00
Aarushi dcfad263cb
feat(blocks): Add Exa API Blocks (#8835)
Adding Exa API blocks because Exa does very cool search and web scraping.

### Changes 🏗️

Added Exa API blocks:
- Search

Added a new calendar and time input.

Added `_auth.py` for the Exa API as well.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-06 10:45:40 +00:00
Zamil Majdy 9ad9dd9fe1
fix(frontend): Agent output not being re-fetched on each agent output dialog opened (#8883)
https://github.com/user-attachments/assets/edd6908e-ecf3-45c2-94d7-3f88de70bb8f

### Changes 🏗️

`fetchBlockResults` should always be triggered when `isOutputOpen` is
true.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-06 05:15:18 +00:00
Zamil Majdy e2904136bd
fix(backend): Make sure all the obtained DB connections are able to query (#8894)
### Changes 🏗️

We've seen symptoms where, during the initial startup of the
application, the obtained DB connection produces an error when the
network is unreachable. This change makes sure the obtained connection
can actually run a query, retrying on the spot if it cannot (see the
sketch below).
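
A minimal sketch of the check-and-retry idea; `connect` and
`execute_raw` are stand-ins for whatever the DB layer actually exposes,
not the platform's real API.

```
import asyncio

async def acquire_checked_connection(connect, max_retries: int = 3):
    """Obtain a DB connection and verify it can actually run a query."""
    for attempt in range(1, max_retries + 1):
        conn = await connect()  # caller-supplied coroutine returning a connection
        try:
            await conn.execute_raw("SELECT 1")  # cheap liveness probe
            return conn
        except Exception:
            if attempt == max_retries:
                raise
            # The real code would also dispose of the bad connection here.
            await asyncio.sleep(2 ** attempt)  # back off before retrying
```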

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-06 04:51:25 +00:00
Zamil Majdy 6dba31e021
fix(backend): Enable Jinja SandboxedEnvironment for TextFormatter (#8891)
We still use plain (unsandboxed) Jinja objects for text formatting in our block code.

### Changes 🏗️

Introduced a `TextFormatter` utility class that uses Jinja's
`SandboxedEnvironment` for safer text formatting (see the sketch below).
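
A minimal sketch of the idea using `jinja2.sandbox.SandboxedEnvironment`;
the class shape and method name are assumptions, not the platform's
exact utility.

```
from jinja2.sandbox import SandboxedEnvironment

class TextFormatter:
    """Format templates through Jinja's sandbox (illustrative sketch)."""

    def __init__(self):
        # The sandbox blocks access to unsafe attributes/methods during rendering.
        self.env = SandboxedEnvironment(autoescape=False)

    def format_string(self, template: str, **values) -> str:
        return self.env.from_string(template).render(**values)

# formatter = TextFormatter()
# formatter.format_string("Hello {{ name }}!", name="world")
```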

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-06 04:21:30 +00:00
Zamil Majdy ffc3eff7e2
fix(backend): Add stricter URL validation for block requests (#8890)
We need stricter URL validation for the hostnames that block code is
allowed to request.

### Changes 🏗️

* Canonicalization: Ensures \ are converted to /, adds http:// if
missing, and normalizes the input URL.
* Scheme Check: Only http or https are allowed.
* Hostname Validation:
    - Ensures a hostname exists.
    - Converts it to an IDNA ASCII form to prevent Unicode spoofing.
    - Verifies that the hostname matches a safe DNS pattern.
* Trusted Origins Check: Allows certain hostnames explicitly if needed.
* IP Resolution and Blocking:
    - Resolves the hostname to its IP addresses.
    - Checks against a list of private/reserved IP networks to prevent
SSRF to internal services (see the sketch below).
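
A stdlib-only sketch of these checks (simplified: the safe-DNS-pattern
check is omitted and all names are illustrative, not the actual
validator).

```
import ipaddress
import socket
from urllib.parse import urlparse

BLOCKED_NETWORKS = [
    ipaddress.ip_network(n)
    for n in ("127.0.0.0/8", "10.0.0.0/8", "172.16.0.0/12",
              "192.168.0.0/16", "169.254.0.0/16", "::1/128", "fc00::/7")
]

def validate_url(url: str, trusted_origins: frozenset[str] = frozenset()) -> str:
    # Canonicalize: backslashes to slashes, default scheme if missing.
    url = url.replace("\\", "/").strip()
    if "://" not in url:
        url = "http://" + url
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        raise ValueError("Only http/https URLs are allowed")
    hostname = parsed.hostname
    if not hostname:
        raise ValueError("URL has no hostname")
    ascii_host = hostname.encode("idna").decode()  # defuse Unicode spoofing
    if ascii_host in trusted_origins:
        return url
    # Resolve the hostname and block private/reserved ranges to prevent SSRF.
    for info in socket.getaddrinfo(ascii_host, None):
        ip = ipaddress.ip_address(info[4][0])
        if any(ip in net for net in BLOCKED_NETWORKS):
            raise ValueError(f"Access to {ip} is not allowed")
    return url
```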

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-06 04:21:24 +00:00
dependabot[bot] 73eafa37c6
build(deps): bump the production-dependencies group in /autogpt_platform/frontend with 5 updates (#8865)
Bumps the production-dependencies group in /autogpt_platform/frontend
with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [@sentry/nextjs](https://github.com/getsentry/sentry-javascript) |
`8.40.0` | `8.42.0` |
| [@supabase/supabase-js](https://github.com/supabase/supabase-js) |
`2.46.1` | `2.46.2` |
| [class-variance-authority](https://github.com/joe-bell/cva) | `0.7.0`
| `0.7.1` |
|
[lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react)
| `0.460.0` | `0.462.0` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.4.0`
| `9.4.1` |

Updates `@sentry/nextjs` from 8.40.0 to 8.42.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/releases"><code>@​sentry/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>8.42.0</h2>
<h3>Important Changes</h3>
<ul>
<li>
<p><strong>feat(react): React Router v7 support (library) (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14513">#14513</a>)</strong></p>
<p>This release adds support for <a
href="https://reactrouter.com/home#react-router-as-a-library">React
Router v7 (library mode)</a>.
Check out the docs on how to set up the integration: <a
href="https://docs.sentry.io/platforms/javascript/guides/react/features/react-router/v7/">Sentry
React Router v7 Integration Docs</a></p>
</li>
</ul>
<h3>Deprecations</h3>
<ul>
<li>
<p><strong>feat: Warn about source-map generation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14533">#14533</a>)</strong></p>
<p>In the next major version of the SDK we will change how source maps
are generated when the SDK is added to an application.
Currently, the implementation varies a lot between different SDKs and
can be difficult to understand.
Moving forward, our goal is to turn on source maps for every framework,
unless we detect that they are explicitly turned off.
Additionally, if we end up enabling source maps, we will emit a log
message that we did so.</p>
<p>With this particular release, we are emitting warnings that source
map generation will change in the future and we print instructions on
how to prepare for the next major.</p>
</li>
<li>
<p><strong>feat(nuxt): Deprecate <code>tracingOptions</code> in favor of
<code>vueIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14530">#14530</a>)</strong></p>
<p>Currently it is possible to configure tracing options in two places
in the Sentry Nuxt SDK:</p>
<ul>
<li>In <code>Sentry.init()</code></li>
<li>Inside <code>tracingOptions</code> in
<code>Sentry.init()</code></li>
</ul>
<p>For tree-shaking purposes and alignment with the Vue SDK, it is now
recommended to instead use the newly exported
<code>vueIntegration()</code> and its <code>tracingOptions</code> option
to configure tracing options in the Nuxt SDK:</p>
<pre lang="ts"><code>// sentry.client.config.ts
import * as Sentry from '@sentry/nuxt';
<p>Sentry.init({<br />
// ...<br />
integrations: [<br />
Sentry.vueIntegration({<br />
tracingOptions: {<br />
trackComponents: true,<br />
},<br />
}),<br />
],<br />
});<br />
</code></pre></p>
</li>
</ul>
<h3>Other Changes</h3>
<ul>
<li>feat(browser-utils): Update <code>web-vitals</code> to v4.2.4 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14439">#14439</a>)</li>
<li>feat(nuxt): Expose <code>vueIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14526">#14526</a>)</li>
<li>fix(feedback): Handle css correctly in screenshot mode (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14535">#14535</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/blob/8.42.0/CHANGELOG.md"><code>@​sentry/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.42.0</h2>
<h3>Important Changes</h3>
<ul>
<li>
<p><strong>feat(react): React Router v7 support (library) (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14513">#14513</a>)</strong></p>
<p>This release adds support for <a
href="https://reactrouter.com/home#react-router-as-a-library">React
Router v7 (library mode)</a>.
Check out the docs on how to set up the integration: <a
href="https://docs.sentry.io/platforms/javascript/guides/react/features/react-router/v7/">Sentry
React Router v7 Integration Docs</a></p>
</li>
</ul>
<h3>Deprecations</h3>
<ul>
<li>
<p><strong>feat: Warn about source-map generation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14533">#14533</a>)</strong></p>
<p>In the next major version of the SDK we will change how source maps
are generated when the SDK is added to an application.
Currently, the implementation varies a lot between different SDKs and
can be difficult to understand.
Moving forward, our goal is to turn on source maps for every framework,
unless we detect that they are explicitly turned off.
Additionally, if we end up enabling source maps, we will emit a log
message that we did so.</p>
<p>With this particular release, we are emitting warnings that source
map generation will change in the future and we print instructions on
how to prepare for the next major.</p>
</li>
<li>
<p><strong>feat(nuxt): Deprecate <code>tracingOptions</code> in favor of
<code>vueIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14530">#14530</a>)</strong></p>
<p>Currently it is possible to configure tracing options in two places
in the Sentry Nuxt SDK:</p>
<ul>
<li>In <code>Sentry.init()</code></li>
<li>Inside <code>tracingOptions</code> in
<code>Sentry.init()</code></li>
</ul>
<p>For tree-shaking purposes and alignment with the Vue SDK, it is now
recommended to instead use the newly exported
<code>vueIntegration()</code> and its <code>tracingOptions</code> option
to configure tracing options in the Nuxt SDK:</p>
<pre lang="ts"><code>// sentry.client.config.ts
import * as Sentry from '@sentry/nuxt';
<p>Sentry.init({<br />
// ...<br />
integrations: [<br />
Sentry.vueIntegration({<br />
tracingOptions: {<br />
trackComponents: true,<br />
},<br />
}),<br />
],<br />
});<br />
</code></pre></p>
</li>
</ul>
<h3>Other Changes</h3>
<ul>
<li>feat(browser-utils): Update <code>web-vitals</code> to v4.2.4 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14439">#14439</a>)</li>
<li>feat(nuxt): Expose <code>vueIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14526">#14526</a>)</li>
<li>fix(feedback): Handle css correctly in screenshot mode (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14535">#14535</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="faa64d0d4d"><code>faa64d0</code></a>
release: 8.42.0</li>
<li><a
href="da3a72c3cc"><code>da3a72c</code></a>
Merge pull request <a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14538">#14538</a>
from getsentry/prepare-release/8.42.0</li>
<li><a
href="e695a5ec41"><code>e695a5e</code></a>
meta(changelog): Update changelog for 8.42.0</li>
<li><a
href="0b349eb021"><code>0b349eb</code></a>
ci(deps): Bump codecov/codecov-action from 4 to 5 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14537">#14537</a>)</li>
<li><a
href="146bafc62a"><code>146bafc</code></a>
feat(nuxt): Deprecate <code>tracingOptions</code> in favor of
<code>vueIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14530">#14530</a>)</li>
<li><a
href="87b789cfc3"><code>87b789c</code></a>
feat(browser-utils): Update <code>web-vitals</code> to v4.2.4 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14439">#14439</a>)</li>
<li><a
href="9b9ec7775c"><code>9b9ec77</code></a>
feat(nuxt): Expose <code>vueIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14526">#14526</a>)</li>
<li><a
href="44477df43d"><code>44477df</code></a>
fix(feeback): Handle css correctly in screenshot mode (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14535">#14535</a>)</li>
<li><a
href="3fdab04962"><code>3fdab04</code></a>
feat: Warn about source-map generation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14533">#14533</a>)</li>
<li><a
href="e17bd91db7"><code>e17bd91</code></a>
chore: Dedupe <code>@sentry/core</code> imports (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14529">#14529</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-javascript/compare/8.40.0...8.42.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `@supabase/supabase-js` from 2.46.1 to 2.46.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-js/releases"><code>@​supabase/supabase-js</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v2.46.2</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.46.1...v2.46.2">2.46.2</a>
(2024-11-27)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>bump up realtime-js (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1318">#1318</a>)
(<a
href="456f27e02e">456f27e</a>)</li>
</ul>
<h2>v2.46.2-rc.3</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.46.2-rc.2...v2.46.2-rc.3">2.46.2-rc.3</a>
(2024-11-13)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>cut release (<a
href="917cbf717c">917cbf7</a>)</li>
</ul>
<h2>v2.46.2-rc.2</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.46.2-rc.1...v2.46.2-rc.2">2.46.2-rc.2</a>
(2024-11-13)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>bump postgrest-js to 1.17.4 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1310">#1310</a>)
(<a
href="64ac43bc08">64ac43b</a>)</li>
</ul>
<h2>v2.46.2-rc.1</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.46.1...v2.46.2-rc.1">2.46.2-rc.1</a>
(2024-11-06)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>postgrest-js v1.17.3 (<a
href="c6c42b6038">c6c42b6</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="456f27e02e"><code>456f27e</code></a>
fix: bump up realtime-js (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1318">#1318</a>)</li>
<li>See full diff in <a
href="https://github.com/supabase/supabase-js/compare/v2.46.1...v2.46.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `class-variance-authority` from 0.7.0 to 0.7.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/joe-bell/cva/releases">class-variance-authority's
releases</a>.</em></p>
<blockquote>
<h2>v0.7.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Add LICENSE Comments by <a
href="https://github.com/joe-bell"><code>@​joe-bell</code></a> in <a
href="https://redirect.github.com/joe-bell/cva/pull/283">joe-bell/cva#283</a></li>
<li>chore: move clsx dependency to caret/semver range by <a
href="https://github.com/philwolstenholme"><code>@​philwolstenholme</code></a>
in <a
href="https://redirect.github.com/joe-bell/cva/pull/316">joe-bell/cva#316</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/philwolstenholme"><code>@​philwolstenholme</code></a>
made their first contribution in <a
href="https://redirect.github.com/joe-bell/cva/pull/316">joe-bell/cva#316</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/joe-bell/cva/compare/v0.7.0...v0.7.1">https://github.com/joe-bell/cva/compare/v0.7.0...v0.7.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="45462dd239"><code>45462dd</code></a>
class-variance-authority@0.7.1</li>
<li><a
href="c236552742"><code>c236552</code></a>
docs: change x.com references to bluesky</li>
<li><a
href="985dba91cf"><code>985dba9</code></a>
chore: move clsx dependency to caret/semver range (<a
href="https://redirect.github.com/joe-bell/cva/issues/316">#316</a>)</li>
<li><a
href="d4ded2dfcc"><code>d4ded2d</code></a>
chore: update sponsors.svg [ci skip]</li>
<li><a
href="ff1717cbe3"><code>ff1717c</code></a>
ci(schedule): adjust cron date to offset midnight traffic</li>
<li><a
href="2f96730b7b"><code>2f96730</code></a>
ci: prevent scheduled workflow running in forks</li>
<li><a
href="aaae670a35"><code>aaae670</code></a>
docs(beta): bun installation</li>
<li><a
href="69feb436b6"><code>69feb43</code></a>
update docs for bun installation (<a
href="https://redirect.github.com/joe-bell/cva/issues/261">#261</a>)</li>
<li><a
href="f9e2ea6764"><code>f9e2ea6</code></a>
chore(docs): update banner links</li>
<li><a
href="5228f0e66f"><code>5228f0e</code></a>
chore: link sponsors to raw svg</li>
<li>Additional commits viewable in <a
href="https://github.com/joe-bell/cva/compare/v0.7.0...v0.7.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `lucide-react` from 0.460.0 to 0.462.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/lucide-icons/lucide/releases">lucide-react's
releases</a>.</em></p>
<blockquote>
<h2>New icons 0.462.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>image-upscale</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2462">#2462</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>grid-2x2</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2628">#2628</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
<li><code>ship</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2548">#2548</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
<li><code>shuffle</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2478">#2478</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
<li><code>venetian-mask</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/1950">#1950</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>New icons 0.461.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>calendar-sync</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2590">#2590</a>)
by <a
href="https://github.com/chessurisme"><code>@​chessurisme</code></a></li>
</ul>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>scale-3d</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2627">#2627</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>Hotfix lucide-svelte icon imports</h2>
<p>Icons imports broke in <code>lucide-svelte</code> after
<code>0.458.0</code>.</p>
<p>This is fixed in <a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2615">#2615</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1d5c725b58"><code>1d5c725</code></a>
Fix path image backer</li>
<li><a
href="d9a011994a"><code>d9a0119</code></a>
feat(readme): add pdfme as an awesome backer (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2639">#2639</a>)</li>
<li><a
href="c6c645ca7f"><code>c6c645c</code></a>
docs(readme): Update readme files (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2634">#2634</a>)</li>
<li>See full diff in <a
href="https://github.com/lucide-icons/lucide/commits/0.462.0/packages/lucide-react">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-day-picker` from 9.4.0 to 9.4.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gpbl/react-day-picker/releases">react-day-picker's
releases</a>.</em></p>
<blockquote>
<h2>v9.4.1</h2>
<p>This release improves support for screen readers and fixes a
VoiceOver issue when navigating the calendar.</p>
<h2>What's Changed</h2>
<ul>
<li>fix(a11y): improve screen reader and VoiceOver support by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2609">gpbl/react-day-picker#2609</a></li>
<li>feat(a11y): added <code>role</code> and <code>aria-label</code>
props by <a href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2609">gpbl/react-day-picker#2609</a></li>
<li>chore(style): remove unused CSS variable by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2610">gpbl/react-day-picker#2610</a></li>
<li>chore: use callbacks for dropdown event handlers by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2602">gpbl/react-day-picker#2602</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gpbl/react-day-picker/compare/v9.4.0...v9.4.1">https://github.com/gpbl/react-day-picker/compare/v9.4.0...v9.4.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="35a2824c22"><code>35a2824</code></a>
chore(style): remove unused CSS variable (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2610">#2610</a>)</li>
<li><a
href="3d994aaaf7"><code>3d994aa</code></a>
a11y: improve screen reader and VoiceOver support (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2609">#2609</a>)</li>
<li><a
href="37cc0ca1e5"><code>37cc0ca</code></a>
build: bump 9.4.1</li>
<li><a
href="105b0fb9e6"><code>105b0fb</code></a>
docs: update Time Zone guide</li>
<li><a
href="8ae3889a0b"><code>8ae3889</code></a>
Revert &quot;build(website): update dependencies&quot;</li>
<li><a
href="231b426ccd"><code>231b426</code></a>
build(website): update dependencies</li>
<li><a
href="82fd69d7e1"><code>82fd69d</code></a>
docs: add sitemap to docusaurus</li>
<li><a
href="d41e055078"><code>d41e055</code></a>
docs: fix image in anatomy.mdx</li>
<li><a
href="c9f995720e"><code>c9f9957</code></a>
docs: remove CSS modules example, add docusaurus-plugin-plausible</li>
<li><a
href="af97f1d747"><code>af97f1d</code></a>
Merge branch 'main' of <a
href="https://github.com/gpbl/react-day-picker">https://github.com/gpbl/react-day-picker</a></li>
<li>Additional commits viewable in <a
href="https://github.com/gpbl/react-day-picker/compare/v9.4.0...v9.4.1">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-12-06 04:07:14 +00:00
dependabot[bot] c621226554
build(deps-dev): bump the development-dependencies group in /autogpt_platform/market with 2 updates (#8871)
Bumps the development-dependencies group in /autogpt_platform/market
with 2 updates: [pytest](https://github.com/pytest-dev/pytest) and
[ruff](https://github.com/astral-sh/ruff).

Updates `pytest` from 8.3.3 to 8.3.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pytest-dev/pytest/releases">pytest's
releases</a>.</em></p>
<blockquote>
<h2>8.3.4</h2>
<h1>pytest 8.3.4 (2024-12-01)</h1>
<h2>Bug fixes</h2>
<ul>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/12592">#12592</a>:
Fixed <code>KeyError</code>{.interpreted-text role=&quot;class&quot;}
crash when using <code>--import-mode=importlib</code> in a directory
layout where a directory contains a child directory with the same
name.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/12818">#12818</a>:
Assertion rewriting now preserves the source ranges of the original
instructions, making it play well with tools that deal with the
<code>AST</code>, like <a
href="https://github.com/alexmojaki/executing">executing</a>.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/12849">#12849</a>:
ANSI escape codes for colored output now handled correctly in
<code>pytest.fail</code>{.interpreted-text role=&quot;func&quot;} with
[pytrace=False]{.title-ref}.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/9353">#9353</a>:
<code>pytest.approx</code>{.interpreted-text role=&quot;func&quot;} now
uses strict equality when given booleans.</p>
</li>
</ul>
<h2>Improved documentation</h2>
<ul>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/10558">#10558</a>:
Fix ambiguous docstring of
<code>pytest.Config.getoption</code>{.interpreted-text
role=&quot;func&quot;}.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/10829">#10829</a>:
Improve documentation on the current handling of the
<code>--basetemp</code> option and its lack of retention functionality
(<code>temporary directory location and
retention</code>{.interpreted-text role=&quot;ref&quot;}).</p>
</li>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/12866">#12866</a>:
Improved cross-references concerning the
<code>recwarn</code>{.interpreted-text role=&quot;fixture&quot;}
fixture.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/pytest-dev/pytest/issues/12966">#12966</a>:
Clarify <code>filterwarnings</code>{.interpreted-text
role=&quot;ref&quot;} docs on filter precedence/order when using
multiple <code>@pytest.mark.filterwarnings
&lt;pytest.mark.filterwarnings ref&gt;</code>{.interpreted-text
role=&quot;ref&quot;} marks.</p>
</li>
</ul>
<h2>Contributor-facing changes</h2>
<ul>
<li><a
href="https://redirect.github.com/pytest-dev/pytest/issues/12497">#12497</a>:
Fixed two failing pdb-related tests on Python 3.13.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="53f8b4e634"><code>53f8b4e</code></a>
Update pypa/gh-action-pypi-publish to v1.12.2</li>
<li><a
href="98dff36c9d"><code>98dff36</code></a>
Prepare release version 8.3.4</li>
<li><a
href="1b474e221d"><code>1b474e2</code></a>
approx: use exact comparison for bool (<a
href="https://redirect.github.com/pytest-dev/pytest/issues/13013">#13013</a>)</li>
<li><a
href="b541721529"><code>b541721</code></a>
docs: Fix wrong statement about sys.modules with importlib import mode
(<a
href="https://redirect.github.com/pytest-dev/pytest/issues/1298">#1298</a>...</li>
<li><a
href="16cb87b650"><code>16cb87b</code></a>
pytest.fail: fix ANSI escape codes for colored output (<a
href="https://redirect.github.com/pytest-dev/pytest/issues/12959">#12959</a>)
(<a
href="https://redirect.github.com/pytest-dev/pytest/issues/12990">#12990</a>)</li>
<li><a
href="be6bc812b0"><code>be6bc81</code></a>
Issue <a
href="https://redirect.github.com/pytest-dev/pytest/issues/12966">#12966</a>
Clarify filterwarnings docs on precedence when using multiple ma...</li>
<li><a
href="7aeb72bbc6"><code>7aeb72b</code></a>
Improve docs on basetemp and retention (<a
href="https://redirect.github.com/pytest-dev/pytest/issues/12912">#12912</a>)
(<a
href="https://redirect.github.com/pytest-dev/pytest/issues/12928">#12928</a>)</li>
<li><a
href="c8758414cf"><code>c875841</code></a>
Merge pull request <a
href="https://redirect.github.com/pytest-dev/pytest/issues/12917">#12917</a>
from pytest-dev/patchback/backports/8.3.x/ded1f44e5...</li>
<li><a
href="6502816d97"><code>6502816</code></a>
Merge pull request <a
href="https://redirect.github.com/pytest-dev/pytest/issues/12913">#12913</a>
from jakkdl/dontfailonbadpath</li>
<li><a
href="52135b033f"><code>52135b0</code></a>
Merge pull request <a
href="https://redirect.github.com/pytest-dev/pytest/issues/12885">#12885</a>
from The-Compiler/pdb-py311 (<a
href="https://redirect.github.com/pytest-dev/pytest/issues/12887">#12887</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pytest-dev/pytest/compare/8.3.3...8.3.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `ruff` from 0.8.0 to 0.8.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.8.1</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>Formatter: Avoid invalid syntax for format-spec with quotes for all
Python versions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14625">#14625</a>)</li>
<li>Formatter: Consider quotes inside format-specs when choosing the
quotes for an f-string (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14493">#14493</a>)</li>
<li>Formatter: Do not consider f-strings with escaped newlines as
multiline (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14624">#14624</a>)</li>
<li>Formatter: Fix f-string formatting in assignment statement (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14454">#14454</a>)</li>
<li>Formatter: Fix unnecessary space around power operator
(<code>**</code>) in overlong f-string expressions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14489">#14489</a>)</li>
<li>[<code>airflow</code>] Avoid implicit <code>schedule</code> argument
to <code>DAG</code> and <code>@dag</code> (<code>AIR301</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14581">#14581</a>)</li>
<li>[<code>flake8-builtins</code>] Exempt private built-in modules
(<code>A005</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14505">#14505</a>)</li>
<li>[<code>flake8-pytest-style</code>] Fix
<code>pytest.mark.parametrize</code> rules to check calls instead of
decorators (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14515">#14515</a>)</li>
<li>[<code>flake8-type-checking</code>] Implement
<code>runtime-cast-value</code> (<code>TC006</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14511">#14511</a>)</li>
<li>[<code>flake8-type-checking</code>] Implement
<code>unquoted-type-alias</code> (<code>TC007</code>) and
<code>quoted-type-alias</code> (<code>TC008</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/12927">#12927</a>)</li>
<li>[<code>flake8-use-pathlib</code>] Recommend
<code>Path.iterdir()</code> over <code>os.listdir()</code>
(<code>PTH208</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14509">#14509</a>)</li>
<li>[<code>pylint</code>] Extend <code>invalid-envvar-default</code> to
detect <code>os.environ.get</code> (<code>PLW1508</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14512">#14512</a>)</li>
<li>[<code>pylint</code>] Implement <code>len-test</code>
(<code>PLC1802</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14309">#14309</a>)</li>
<li>[<code>refurb</code>] Fix bug where methods defined using lambdas
were flagged by <code>FURB118</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14639">#14639</a>)</li>
<li>[<code>ruff</code>] Auto-add <code>r</code> prefix when string has
no backslashes for <code>unraw-re-pattern</code> (<code>RUF039</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14536">#14536</a>)</li>
<li>[<code>ruff</code>] Implement
<code>invalid-assert-message-literal-argument</code>
(<code>RUF040</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14488">#14488</a>)</li>
<li>[<code>ruff</code>] Implement
<code>unnecessary-nested-literal</code> (<code>RUF041</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14323">#14323</a>)</li>
<li>[<code>ruff</code>] Implement
<code>unnecessary-regular-expression</code> (<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14659">#14659</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>Ignore more rules for stub files (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14541">#14541</a>)</li>
<li>[<code>pep8-naming</code>] Eliminate false positives for
single-letter names (<code>N811</code>, <code>N814</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14584">#14584</a>)</li>
<li>[<code>pyflakes</code>] Avoid false positives in
<code>@no_type_check</code> contexts (<code>F821</code>,
<code>F722</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14615">#14615</a>)</li>
<li>[<code>ruff</code>] Detect redirected-noqa in file-level comments
(<code>RUF101</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14635">#14635</a>)</li>
<li>[<code>ruff</code>] Mark fixes for <code>unsorted-dunder-all</code>
and <code>unsorted-dunder-slots</code> as unsafe when there are complex
comments in the sequence (<code>RUF022</code>, <code>RUF023</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14560">#14560</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Avoid fixing code to <code>None | None</code> for
<code>redundant-none-literal</code> (<code>PYI061</code>) and
<code>never-union</code> (<code>RUF020</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14583">#14583</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/14589">#14589</a>)</li>
<li>[<code>flake8-bugbear</code>] Fix
<code>mutable-contextvar-default</code> to resolve annotated function
calls properly (<code>B039</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14532">#14532</a>)</li>
<li>[<code>flake8-pyi</code>, <code>ruff</code>] Fix traversal of nested
literals and unions (<code>PYI016</code>, <code>PYI051</code>,
<code>PYI055</code>, <code>PYI062</code>, <code>RUF041</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14641">#14641</a>)</li>
<li>[<code>flake8-pyi</code>] Avoid rewriting invalid type expressions
in <code>unnecessary-type-union</code> (<code>PYI055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14660">#14660</a>)</li>
<li>[<code>flake8-type-checking</code>] Avoid syntax errors and type
checking problem for quoted annotations autofix (<code>TC003</code>,
<code>TC006</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14634">#14634</a>)</li>
<li>[<code>pylint</code>] Do not wrap function calls in parentheses in
the fix for unnecessary-dunder-call (<code>PLC2801</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14601">#14601</a>)</li>
<li>[<code>ruff</code>] Handle <code>attrs</code>'s
<code>auto_attribs</code> correctly (<code>RUF009</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14520">#14520</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/Daverball"><code>@​Daverball</code></a></li>
<li><a
href="https://github.com/Glyphack"><code>@​Glyphack</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a
href="https://github.com/Lokejoke"><code>@​Lokejoke</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/cake-monotone"><code>@​cake-monotone</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b3b2c982cd"><code>b3b2c98</code></a>
Update CHANGELOG.md with the new commits for 0.8.1 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14664">#14664</a>)</li>
<li><a
href="abb3c6ea95"><code>abb3c6e</code></a>
[<code>flake8-pyi</code>] Avoid rewriting invalid type expressions in
`unnecessary-type-...</li>
<li><a
href="224fe75a76"><code>224fe75</code></a>
[<code>ruff</code>] Implement
<code>unnecessary-regular-expression</code> (<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14659">#14659</a>)</li>
<li><a
href="dc29f52750"><code>dc29f52</code></a>
[<code>flake8-pyi</code>, <code>ruff</code>] Fix traversal of nested
literals and unions (<code>PYI016</code>,...</li>
<li><a
href="d9cbf2fe44"><code>d9cbf2f</code></a>
Avoids unnecessary overhead for <code>TC004</code>, when
<code>TC001-003</code> are disabled (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14657">#14657</a>)</li>
<li><a
href="3f6c65e78c"><code>3f6c65e</code></a>
[red-knot] Fix merged type after if-else without explicit else branch
(<a
href="https://redirect.github.com/astral-sh/ruff/issues/14621">#14621</a>)</li>
<li><a
href="976c37a849"><code>976c37a</code></a>
Bump version to 0.8.1 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14655">#14655</a>)</li>
<li><a
href="a378ff38dc"><code>a378ff3</code></a>
[red-knot] Fix Boolean flags in mdtests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14654">#14654</a>)</li>
<li><a
href="d8bca0d3a2"><code>d8bca0d</code></a>
Fix bug where methods defined using lambdas were flagged by FURB118 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14639">#14639</a>)</li>
<li><a
href="6f1cf5b686"><code>6f1cf5b</code></a>
[red-knot] Minor fix in MRO tests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14652">#14652</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.0...0.8.1">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-12-06 03:57:20 +00:00
Abhimanyu Yadav 227806aef9
feat(blocks): Add code execution block (#8768)
- Resolves #8766 

Creates a block that executes code in an E2B sandbox.

Demo:


https://github.com/user-attachments/assets/460382c4-5bf7-4f96-a539-88ab263777de

---------

Co-authored-by: Reinier van der Leer <github@pwuts.nl>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-12-06 01:16:19 +00:00
Reinier van der Leer 0272d87af3
ci(backend): Add `poetry.lock` check (#8885)
- Resolves #8884

We need to prevent breaking updates to dependency version requirements
of `autogpt_libs`.
`autogpt_libs/pyproject.toml` and `backend/poetry.lock` are loosely
coupled, and to ensure they stay in sync we need an extra check.
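
For illustration, a minimal sketch of what such a check could look like
(the directory and the `poetry check --lock` invocation are assumptions
about recent Poetry versions, not the actual CI step):

```python
import subprocess
import sys


# Hypothetical lock-consistency check: fail CI if backend/poetry.lock no
# longer matches the current pyproject.toml constraints.
def check_lock(project_dir: str = "autogpt_platform/backend") -> int:
    result = subprocess.run(["poetry", "check", "--lock"], cwd=project_dir)
    return result.returncode


if __name__ == "__main__":
    sys.exit(check_lock())
```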

For now I'm also reverting the breaking update of #8787, otherwise this
added CI check will immediately fail.

### Changes
- ci(backend): Add `poetry.lock` check
- Revert "build(deps): bump pydantic from 2.9.2 to 2.10.2 in
/autogpt_platform/autogpt_libs in the production-dependencies group
across 1 directory (#8787)"
2024-12-05 18:41:59 +00:00
Reinier van der Leer 64f5e60d12
feat(blocks): Add webhook block status indicator (#8838)
- Resolves #8743
- Follow-up to #8358

### Demo


https://github.com/user-attachments/assets/f983dfa2-2dc2-4ab0-8373-e768ba17e6f7

### Changes 🏗️

- feat(frontend): Add webhook status indicator on `CustomNode`
   - Add `webhookId` to frontend node data model

- fix(backend): Fix webhook ping endpoint
   - Remove `provider` path parameter
   - Fix return values and error handling
   - Fix `WebhooksManager.trigger_ping(..)`
      - Add `credentials` parameter
      - Fix usage of credentials
   - Fix `.data.integrations.wait_for_webhook_event(..)`
      - Add `AsyncRedisEventBus.wait_for_event(..)`

- feat(frontend): Add `BackendAPIProvider` + `useBackendAPI`

- feat(frontend): Improve layout of node header

    Before:

![image](https://github.com/user-attachments/assets/17a33b94-65f0-4e34-a47d-2dd321edecae)
    After:

![image](https://github.com/user-attachments/assets/64afb1e4-e3f2-4ca9-8961-f1245f25477f)

- refactor(backend): Clean up `.data.integrations`
- refactor(backend): Fix naming in `.data.queue` for understandability

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Add webhook block, save -> gray indicator
  - [x] Add necessary info to webhook block, save -> green indicator
  - [x] Remove necessary info, save -> gray indicator

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-12-05 10:35:13 +00:00
Nicholas Tindle 6b742d1a8c
docs: add docs for writing playwright tests (#8877)
<!-- Clearly explain the need for these changes: -->
Nick wants others to be able to write tests besides Nick

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->
- Fixes various import errors across the docs to resolve dead links
- Adds Docs for making and debugging your own tests

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
2024-12-04 18:17:17 +00:00
Nicholas Tindle d4edb9371d
feat(blocks): Add Slant 3D printing via API service (#8805)
<!-- Clearly explain the need for these changes: -->

I want to be able to have agents 3d print things and deliver them to my
house!

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->
- Adds slant3d as a provider
- Adds slant3d order webhook (disabled on the cloud by default due to
how it notifies users)
- Adds several blocks to order from slant3d
- Disables Get Orders (for the same reason as the webhook)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan

<details>
  <summary>Test Plan</summary>
  
  - [ ] Add filament block and fill API key
  - [ ] Run filament block
- [ ] Add slice block and use this value:
https://files.printables.com/media/prints/1081287/stls/8176524_a9edde2d-68c1-41de-a207-b584fcf42f30_f9127d5b-39ed-4ef8-b59f-d3a0bc874373/rod-holder.stl
  - [ ] Run slice block
- [ ] Add estimate blocks (print and shipping) and use your address, and
the above file
  - [ ] select petg and count 1
  - [ ] run the blocks
  - [ ] Create an order using same information
  - [ ] Run the block and note the order number
  - [ ] Delete the create order block so you don't keep ordering stuff
  - [ ] Run get orders block
  - [ ] Check your order exists
  - [ ] Run the cancel order block with the order id
  - [ ] run the get orders block and check the order no longer exists
</details>

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-12-04 02:44:29 +00:00
Nicholas Tindle 89011aabe0
feat(frontend): add block tests (#8804)
<!-- Clearly explain the need for these changes: -->
We want to be able to automatically test agent running, creation, and
building via the build page

### Changes 🏗️
- updates many UI elements to have new data ids 
- adds page for build
- adds spec for build
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run the UI Tests!

---------

Co-authored-by: Bently <tomnoon9@gmail.com>
2024-12-03 16:10:46 +00:00
Abhimanyu Yadav 43bd5c89d7
fix(frontend): advanced-toggle-default (#8802)
- resolve #8739

I don't think this is a frontend issue (though I might be wrong): if we
are not classifying a particular input as `advanced = true/false`, then
we automatically get `advanced = True`.

<img width="1142" alt="Screenshot 2024-11-27 at 10 36 59 AM"
src="https://github.com/user-attachments/assets/e8d9c037-5b8b-45b2-b40b-8390bc63de99">
2024-12-03 12:14:17 +00:00
dependabot[bot] 0a604a5746
build(deps-dev): bump ruff from 0.8.0 to 0.8.1 in /autogpt_platform/autogpt_libs in the development-dependencies group (#8864)
Bumps the development-dependencies group in
/autogpt_platform/autogpt_libs with 1 update:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.8.0 to 0.8.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.8.1</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>Formatter: Avoid invalid syntax for format-spec with quotes for all
Python versions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14625">#14625</a>)</li>
<li>Formatter: Consider quotes inside format-specs when choosing the
quotes for an f-string (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14493">#14493</a>)</li>
<li>Formatter: Do not consider f-strings with escaped newlines as
multiline (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14624">#14624</a>)</li>
<li>Formatter: Fix f-string formatting in assignment statement (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14454">#14454</a>)</li>
<li>Formatter: Fix unnecessary space around power operator
(<code>**</code>) in overlong f-string expressions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14489">#14489</a>)</li>
<li>[<code>airflow</code>] Avoid implicit <code>schedule</code> argument
to <code>DAG</code> and <code>@dag</code> (<code>AIR301</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14581">#14581</a>)</li>
<li>[<code>flake8-builtins</code>] Exempt private built-in modules
(<code>A005</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14505">#14505</a>)</li>
<li>[<code>flake8-pytest-style</code>] Fix
<code>pytest.mark.parametrize</code> rules to check calls instead of
decorators (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14515">#14515</a>)</li>
<li>[<code>flake8-type-checking</code>] Implement
<code>runtime-cast-value</code> (<code>TC006</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14511">#14511</a>)</li>
<li>[<code>flake8-type-checking</code>] Implement
<code>unquoted-type-alias</code> (<code>TC007</code>) and
<code>quoted-type-alias</code> (<code>TC008</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/12927">#12927</a>)</li>
<li>[<code>flake8-use-pathlib</code>] Recommend
<code>Path.iterdir()</code> over <code>os.listdir()</code>
(<code>PTH208</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14509">#14509</a>)</li>
<li>[<code>pylint</code>] Extend <code>invalid-envvar-default</code> to
detect <code>os.environ.get</code> (<code>PLW1508</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14512">#14512</a>)</li>
<li>[<code>pylint</code>] Implement <code>len-test</code>
(<code>PLC1802</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14309">#14309</a>)</li>
<li>[<code>refurb</code>] Fix bug where methods defined using lambdas
were flagged by <code>FURB118</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14639">#14639</a>)</li>
<li>[<code>ruff</code>] Auto-add <code>r</code> prefix when string has
no backslashes for <code>unraw-re-pattern</code> (<code>RUF039</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/14536">#14536</a>)</li>
<li>[<code>ruff</code>] Implement
<code>invalid-assert-message-literal-argument</code>
(<code>RUF040</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14488">#14488</a>)</li>
<li>[<code>ruff</code>] Implement
<code>unnecessary-nested-literal</code> (<code>RUF041</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14323">#14323</a>)</li>
<li>[<code>ruff</code>] Implement
<code>unnecessary-regular-expression</code> (<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14659">#14659</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>Ignore more rules for stub files (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14541">#14541</a>)</li>
<li>[<code>pep8-naming</code>] Eliminate false positives for
single-letter names (<code>N811</code>, <code>N814</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14584">#14584</a>)</li>
<li>[<code>pyflakes</code>] Avoid false positives in
<code>@no_type_check</code> contexts (<code>F821</code>,
<code>F722</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14615">#14615</a>)</li>
<li>[<code>ruff</code>] Detect redirected-noqa in file-level comments
(<code>RUF101</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14635">#14635</a>)</li>
<li>[<code>ruff</code>] Mark fixes for <code>unsorted-dunder-all</code>
and <code>unsorted-dunder-slots</code> as unsafe when there are complex
comments in the sequence (<code>RUF022</code>, <code>RUF023</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14560">#14560</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Avoid fixing code to <code>None | None</code> for
<code>redundant-none-literal</code> (<code>PYI061</code>) and
<code>never-union</code> (<code>RUF020</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14583">#14583</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/14589">#14589</a>)</li>
<li>[<code>flake8-bugbear</code>] Fix
<code>mutable-contextvar-default</code> to resolve annotated function
calls properly (<code>B039</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14532">#14532</a>)</li>
<li>[<code>flake8-pyi</code>, <code>ruff</code>] Fix traversal of nested
literals and unions (<code>PYI016</code>, <code>PYI051</code>,
<code>PYI055</code>, <code>PYI062</code>, <code>RUF041</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14641">#14641</a>)</li>
<li>[<code>flake8-pyi</code>] Avoid rewriting invalid type expressions
in <code>unnecessary-type-union</code> (<code>PYI055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14660">#14660</a>)</li>
<li>[<code>flake8-type-checking</code>] Avoid syntax errors and type
checking problem for quoted annotations autofix (<code>TC003</code>,
<code>TC006</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14634">#14634</a>)</li>
<li>[<code>pylint</code>] Do not wrap function calls in parentheses in
the fix for unnecessary-dunder-call (<code>PLC2801</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14601">#14601</a>)</li>
<li>[<code>ruff</code>] Handle <code>attrs</code>'s
<code>auto_attribs</code> correctly (<code>RUF009</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/14520">#14520</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/Daverball"><code>@​Daverball</code></a></li>
<li><a
href="https://github.com/Glyphack"><code>@​Glyphack</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a
href="https://github.com/Lokejoke"><code>@​Lokejoke</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/cake-monotone"><code>@​cake-monotone</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b3b2c982cd"><code>b3b2c98</code></a>
Update CHANGELOG.md with the new commits for 0.8.1 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14664">#14664</a>)</li>
<li><a
href="abb3c6ea95"><code>abb3c6e</code></a>
[<code>flake8-pyi</code>] Avoid rewriting invalid type expressions in
`unnecessary-type-...</li>
<li><a
href="224fe75a76"><code>224fe75</code></a>
[<code>ruff</code>] Implement
<code>unnecessary-regular-expression</code> (<code>RUF055</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14659">#14659</a>)</li>
<li><a
href="dc29f52750"><code>dc29f52</code></a>
[<code>flake8-pyi</code>, <code>ruff</code>] Fix traversal of nested
literals and unions (<code>PYI016</code>,...</li>
<li><a
href="d9cbf2fe44"><code>d9cbf2f</code></a>
Avoids unnecessary overhead for <code>TC004</code>, when
<code>TC001-003</code> are disabled (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14657">#14657</a>)</li>
<li><a
href="3f6c65e78c"><code>3f6c65e</code></a>
[red-knot] Fix merged type after if-else without explicit else branch
(<a
href="https://redirect.github.com/astral-sh/ruff/issues/14621">#14621</a>)</li>
<li><a
href="976c37a849"><code>976c37a</code></a>
Bump version to 0.8.1 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14655">#14655</a>)</li>
<li><a
href="a378ff38dc"><code>a378ff3</code></a>
[red-knot] Fix Boolean flags in mdtests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14654">#14654</a>)</li>
<li><a
href="d8bca0d3a2"><code>d8bca0d</code></a>
Fix bug where methods defined using lambdas were flagged by FURB118 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14639">#14639</a>)</li>
<li><a
href="6f1cf5b686"><code>6f1cf5b</code></a>
[red-knot] Minor fix in MRO tests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/14652">#14652</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.0...0.8.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.8.0&new-version=0.8.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-03 11:35:15 +00:00
Reinier van der Leer 5ccfb8e4c6
dx(backend): Fix linting & formatting for `autogpt_libs` (#8860)
- Resolves #8859
- Follow-up to #8751

### Changes
- Add `autogpt_libs` to the backend CI path filter
- Add `ruff format` step for `autogpt_libs` to `linter.py` and
`pre-commit` config
- Run `poetry run format` with the new setup
2024-12-03 11:34:07 +00:00
Nicholas Tindle 96bba3c1bd
fix: specify encoding for file with emoji in it so it loads on windows (#8873)
<!-- Clearly explain the need for these changes: -->

On Windows, loading this file kept crashing things on startup, so I
specified the encoding explicitly.
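
The fix is the usual pattern of passing an explicit encoding instead of
relying on the platform default (typically cp1252 on Windows, UTF-8
elsewhere). A minimal sketch with a hypothetical path:

```python
from pathlib import Path

# Hypothetical example: read a file containing emoji the same way on every
# platform by passing the encoding explicitly; relying on the OS default
# (often cp1252 on Windows) can raise UnicodeDecodeError at startup.
text = Path("data/blocks_with_emoji.json").read_text(encoding="utf-8")
```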

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] Run the app!
2024-12-03 11:17:34 +00:00
Aarushi de1cd6c295
chore(blocks/fal): Use dict instead of Dict (#8855)
Replace Dict with dict

### Changes 🏗️

Replace Dict with dict
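
For context, this amounts to switching from the `typing.Dict` alias to
the builtin generic (available since Python 3.9); a minimal,
illustrative before/after:

```python
from typing import Dict  # only needed for the old-style annotation


# Before: typing.Dict alias
def old_style(params: Dict[str, str]) -> Dict[str, int]:
    return {key: len(value) for key, value in params.items()}


# After: builtin generic, no typing import required (Python 3.9+)
def new_style(params: dict[str, str]) -> dict[str, int]:
    return {key: len(value) for key, value in params.items()}
```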

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-03 11:09:38 +00:00
Aarushi 3bca279b35
feat(libs): Add API key rate limit middleware (#8850)
Once we release the API key feature, we will want to be able to rate
limit as well. This is the foundation for that.
For now it is a blanket rate limit; later we will be able to add tiered
rate limits.

### Changes 🏗️

Added a new middleware library in autogpt_libs which contains the logic
for getting the API key, storing its details in Redis, and checking how
many requests it has made, how many are left, and when the limit resets.
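
A minimal sketch of that pattern (a fixed-window counter keyed by API
key in Redis); the limits, key layout, and function name here are
illustrative assumptions, not the actual middleware:

```python
import time

import redis

RATE_LIMIT = 100      # assumed number of requests allowed per window
WINDOW_SECONDS = 60   # assumed window length

r = redis.Redis()


def check_rate_limit(api_key: str) -> tuple[bool, int, int]:
    """Return (allowed, requests_remaining, window_reset_timestamp)."""
    window = int(time.time() // WINDOW_SECONDS)
    key = f"ratelimit:{api_key}:{window}"
    count = r.incr(key)                  # count this request atomically
    if count == 1:
        r.expire(key, WINDOW_SECONDS)    # the window cleans itself up
    remaining = max(RATE_LIMIT - count, 0)
    reset_at = (window + 1) * WINDOW_SECONDS
    return count <= RATE_LIMIT, remaining, reset_at
```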

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-12-03 09:25:29 +00:00
Nicholas Tindle 7c2e371f23
docs: huntr no longer is offering a security bounty so remove it (#8872)
<!-- Clearly explain the need for these changes: -->

Huntr isn't offering a security bounty for AutoGPT at the moment, so
remove it in favor of GitHub security advisories.

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

Comments out the Huntr line in case they decide to offer it again in
the future.
2024-12-02 21:44:43 +00:00
Abhimanyu Yadav dce9bdd488
Add URL swapping for marketplace based on environment (#8418)
### Fixes #8371

These changes are needed to automatically switch between local and
production marketplace URLs, ensuring the app connects to the correct
environment (dev or prod) without manual intervention.

### Changes 🏗️

1. Swaps marketplace URL based on APP_ENV (dev or prod).
2. Ensures correct URL is used for local or production environments.
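
The actual switch lives in the frontend configuration, but the selection
logic amounts to something like the following sketch (the env values and
URLs are placeholders, not the project's real ones):

```python
import os

# Placeholder URLs for illustration only.
MARKETPLACE_URLS = {
    "dev": "http://localhost:8000",
    "prod": "https://marketplace.example.com",
}


def marketplace_url() -> str:
    env = os.getenv("APP_ENV", "dev")
    return MARKETPLACE_URLS.get(env, MARKETPLACE_URLS["dev"])
```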

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-12-02 12:12:10 +00:00
Reinier van der Leer 30bb9a3d72
dx: Fix dependabot PR/commit titles (#8841)
Dependabot's commit messages are bulky and don't use our commit message
scopes. They aren't fully customizable, but this partially fixes that.

### Changes 🏗️

- Fix dependabot commit message scopes

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-12-02 10:39:10 +00:00
Kaitlyn Barnard 758edaec9e
Adding Docs for Agent Blocks (#8845)
### Changes 🏗️

Adding docs for Agent Blocks

Co-authored-by: Bently <tomnoon9@gmail.com>
2024-12-02 10:22:37 +00:00
Bently be7f9123bb
feat(blocks): Add jina fact checker block (#8409)
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-12-02 10:33:36 +00:00
Zamil Majdy 5c49fc87fd
refactor(backend): Apply lint on autogpt_lib folder on backend/linter.py (#8751)
linter.py only applies to the `backend` module, not `autogpt_libs`.

The scope of this PR is to sort this out.

### Changes 🏗️

* Add a linting scope to both the `backend` & `autogpt_libs` modules,
and apply the linter.
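
A rough sketch of what a dual-scope lint step could look like (the paths
and commands are assumptions for illustration, not the actual
`linter.py`):

```python
import subprocess

# Assumed module locations relative to autogpt_platform/backend; the real
# script may differ.
TARGETS = [".", "../autogpt_libs"]


def lint() -> None:
    for target in TARGETS:
        subprocess.run(["ruff", "check", target], check=True)
        subprocess.run(["ruff", "format", "--check", target], check=True)


if __name__ == "__main__":
    lint()
```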


### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-12-02 09:57:53 +00:00
Zamil Majdy 2121ffd06b chore(platform): Bump version to v0.3.4 2024-12-02 09:08:19 +07:00
Zamil Majdy 0c2940353f hotfix(backend): Fix month credit calculation on December (#8851) 2024-12-02 09:00:01 +07:00
Zamil Majdy d26105d382
hotfix(backend): Fix month credit calculation on December (#8851)
When calculating the next month, we are not rolling over the month
number, which causes an error in the credits calculation.

### Changes 🏗️

Add a modulo when calculating the next month.
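
A minimal sketch of the modulo approach (the function name and fields
are illustrative, not the actual credits code):

```python
from datetime import datetime


def next_month_start(now: datetime) -> datetime:
    """First instant of the month after `now`, rolling December into January."""
    month = now.month % 12 + 1          # 12 % 12 + 1 == 1, so December wraps
    year = now.year + (1 if now.month == 12 else 0)
    return now.replace(
        year=year, month=month, day=1, hour=0, minute=0, second=0, microsecond=0
    )


assert next_month_start(datetime(2024, 12, 2)) == datetime(2025, 1, 1)
```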

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-12-02 08:47:40 +07:00
dependabot[bot] 7d48eebc78
build(deps): bump pydantic from 2.9.2 to 2.10.2 in /autogpt_platform/autogpt_libs in the production-dependencies group across 1 directory (#8787)
Bumps the production-dependencies group with 1 update in the
/autogpt_platform/autogpt_libs directory:
[pydantic](https://github.com/pydantic/pydantic).

Updates `pydantic` from 2.9.2 to 2.10.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/releases">pydantic's
releases</a>.</em></p>
<blockquote>
<h2>v2.10.2 2024-11-26</h2>
<h2>What's Changed</h2>
<h3>Fixes</h3>
<ul>
<li>Only evaluate <code>FieldInfo</code> annotations if required during
schema building by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10769">#10769</a></li>
<li>Do not evaluate annotations for private fields by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10962">#10962</a></li>
<li>Support serialization as any for <code>Secret</code> types and
<code>Url</code> types by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10947">#10947</a></li>
<li>Fix type hint of <code>Field.default</code> to be compatible with
Python 3.8 and 3.9 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10972">#10972</a></li>
<li>Add hashing support for URL types by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10975">#10975</a></li>
<li>Hide <code>BaseModel.__replace__</code> definition from type
checkers by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10979">10979</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.10.1...v2.10.2">https://github.com/pydantic/pydantic/compare/v2.10.1...v2.10.2</a></p>
<h2>v2.10.1 2024-11-21</h2>
<h2>What's Changed</h2>
<h3>Packaging</h3>
<ul>
<li>Bump <code>pydantic-core</code> version to <code>v2.27.1</code> by
<a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10938">#10938</a></li>
</ul>
<h3>Fixes</h3>
<ul>
<li>Use the correct frame when instantiating a parametrized
<code>TypeAdapter</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10893">#10893</a></li>
<li>Relax check for validated data in <code>default_factory</code> utils
by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10909">#10909</a></li>
<li>Fix type checking issue with <code>model_fields</code> and
<code>model_computed_fields</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10911">#10911</a></li>
<li>Use the parent configuration during schema generation for stdlib
<code>dataclass</code>es by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10928">#10928</a></li>
<li>Use the <code>globals</code> of the function when evaluating the
return type of serializers and <code>computed_field</code>s by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10929">#10929</a></li>
<li>Fix URL constraint application by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10922">#10922</a></li>
<li>Fix URL equality with different validation methods by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10934">#10934</a></li>
<li>Fix JSON schema title when specified as <code>''</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10936">#10936</a></li>
<li>Fix <code>python</code> mode serialization for <code>complex</code>
inference by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1549">pydantic-core#1549</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.10.0...v2.10.1">https://github.com/pydantic/pydantic/compare/v2.10.0...v2.10.1</a></p>
<h2>v2.10.0 2024-11-20</h2>
<p>The code released in v2.10.0 is practically identical to that of
v2.10.0b2.
See the <a
href="https://pydantic.dev/articles/pydantic-v2-10-release">v2.10
release blog post</a> for the highlights!</p>
<h2>What's Changed</h2>
<h3>Packaging</h3>
<ul>
<li>Bump <code>pydantic-core</code> to <code>v2.27.0</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10825">#10825</a></li>
<li>Replaced pdm with uv by <a
href="https://github.com/frfahim"><code>@​frfahim</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10727">#10727</a></li>
</ul>
<h3>New Features</h3>
<ul>
<li>Support <code>fractions.Fraction</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10318">#10318</a></li>
<li>Support <code>Hashable</code> for json validation by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10324">#10324</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's
changelog</a>.</em></p>
<blockquote>
<h2>v2.10.2 (2024-11-25)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.10.2">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Fixes</h4>
<ul>
<li>Only evaluate FieldInfo annotations if required during schema
building by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10769">#10769</a></li>
<li>Do not evaluate annotations for private fields by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10962">#10962</a></li>
<li>Support serialization as any for <code>Secret</code> types and
<code>Url</code> types by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10947">#10947</a></li>
<li>Fix type hint of <code>Field.default</code> to be compatible with
Python 3.8 and 3.9 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10972">#10972</a></li>
<li>Add hashing support for URL types by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10975">#10975</a></li>
<li>Hide <code>BaseModel.__replace__</code> definition from type
checkers by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10979">10979</a></li>
</ul>
<h2>v2.10.1 (2024-11-21)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.10.1">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<ul>
<li>Bump <code>pydantic-core</code> version to <code>v2.27.1</code> by
<a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10938">#10938</a></li>
</ul>
<h4>Fixes</h4>
<ul>
<li>Use the correct frame when instantiating a parametrized
<code>TypeAdapter</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10893">#10893</a></li>
<li>Relax check for validated data in <code>default_factory</code> utils
by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10909">#10909</a></li>
<li>Fix type checking issue with <code>model_fields</code> and
<code>model_computed_fields</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10911">#10911</a></li>
<li>Use the parent configuration during schema generation for stdlib
<code>dataclass</code>es by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10928">#10928</a></li>
<li>Use the <code>globals</code> of the function when evaluating the
return type of serializers and <code>computed_field</code>s by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10929">#10929</a></li>
<li>Fix URL constraint application by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10922">#10922</a></li>
<li>Fix URL equality with different validation methods by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10934">#10934</a></li>
<li>Fix JSON schema title when specified as <code>''</code> by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10936">#10936</a></li>
<li>Fix <code>python</code> mode serialization for <code>complex</code>
inference by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1549">pydantic-core#1549</a></li>
</ul>
<h3>New Contributors</h3>
<h2>v2.10.0 (2024-11-20)</h2>
<p>The code released in v2.10.0 is practically identical to that of
v2.10.0b2.</p>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.10.0">GitHub
release</a></p>
<p>See the <a
href="https://pydantic.dev/articles/pydantic-v2-10-release">v2.10
release blog post</a> for the highlights!</p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="fe32515498"><code>fe32515</code></a>
Prepare for v2.10.2 release (<a
href="https://redirect.github.com/pydantic/pydantic/issues/10982">#10982</a>)</li>
<li><a
href="226cfaf62b"><code>226cfaf</code></a>
Hide <code>BaseModel.__replace__</code> definition from type checkers
(<a
href="https://redirect.github.com/pydantic/pydantic/issues/10979">#10979</a>)</li>
<li><a
href="02229a6ab1"><code>02229a6</code></a>
hashing support for urls (<a
href="https://redirect.github.com/pydantic/pydantic/issues/10975">#10975</a>)</li>
<li><a
href="a9cf39c32a"><code>a9cf39c</code></a>
Fix type hint of <code>Field.default</code> to be compatible with Python
3.8 and 3.9 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/1">#1</a>...</li>
<li><a
href="869eafd70b"><code>869eafd</code></a>
Support serialization as any for <code>Secret</code> types and
<code>Url</code> types (<a
href="https://redirect.github.com/pydantic/pydantic/issues/10947">#10947</a>)</li>
<li><a
href="7c0ed72aa2"><code>7c0ed72</code></a>
Do not evaluate annotations for private fields (<a
href="https://redirect.github.com/pydantic/pydantic/issues/10962">#10962</a>)</li>
<li><a
href="d6fc7fce7d"><code>d6fc7fc</code></a>
Only evaluate <code>FieldInfo</code> annotations if required during
schema building (<a
href="https://redirect.github.com/pydantic/pydantic/issues/10">#10</a>...</li>
<li><a
href="17e60fafd6"><code>17e60fa</code></a>
spacing</li>
<li><a
href="369b355dba"><code>369b355</code></a>
remove typo</li>
<li><a
href="4c75404d6b"><code>4c75404</code></a>
Prepare for v2.10.1 release (<a
href="https://redirect.github.com/pydantic/pydantic/issues/10939">#10939</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic/compare/v2.9.2...v2.10.2">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pydantic&package-manager=pip&previous-version=2.9.2&new-version=2.10.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-29 13:41:00 +00:00
dependabot[bot] c6b36fbad7
build(deps): bump the production-dependencies group in /autogpt_platform/market with 2 updates (#8757)
Bumps the production-dependencies group in /autogpt_platform/market with
2 updates: [uvicorn](https://github.com/encode/uvicorn) and
[sentry-sdk](https://github.com/getsentry/sentry-python).

Updates `uvicorn` from 0.32.0 to 0.32.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/encode/uvicorn/releases">uvicorn's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.32.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Enable httptools lenient data by <a
href="https://github.com/vvanglro"><code>@​vvanglro</code></a> in <a
href="https://redirect.github.com/encode/uvicorn/pull/2488">encode/uvicorn#2488</a></li>
<li>Drop ASGI spec version to 2.3 on HTTP scope by <a
href="https://github.com/Kludex"><code>@​Kludex</code></a> in <a
href="https://redirect.github.com/encode/uvicorn/pull/2513">encode/uvicorn#2513</a></li>
</ul>
<hr />
<p><strong>Full Changelog</strong>: <a
href="https://github.com/encode/uvicorn/compare/0.32.0...0.32.1">https://github.com/encode/uvicorn/compare/0.32.0...0.32.1</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/encode/uvicorn/blob/master/CHANGELOG.md">uvicorn's
changelog</a>.</em></p>
<blockquote>
<h2>0.32.1 (2024-11-20)</h2>
<h3>Fixed</h3>
<ul>
<li>Drop ASGI spec version to 2.3 on HTTP scope <a
href="https://redirect.github.com/encode/uvicorn/pull/2513">#2513</a></li>
<li>Enable httptools lenient data on <code>httptools &gt;= 0.6.3</code>
<a
href="https://redirect.github.com/encode/uvicorn/pull/2488">#2488</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5279296e62"><code>5279296</code></a>
Upgrade upload/download GitHub Actions (<a
href="https://redirect.github.com/encode/uvicorn/issues/2517">#2517</a>)</li>
<li><a
href="8c3402dd22"><code>8c3402d</code></a>
Update <code>publish.yaml</code> with latest PyPI recommendations (<a
href="https://redirect.github.com/encode/uvicorn/issues/2516">#2516</a>)</li>
<li><a
href="04c6320f39"><code>04c6320</code></a>
Version 0.32.1 (<a
href="https://redirect.github.com/encode/uvicorn/issues/2515">#2515</a>)</li>
<li><a
href="fc6c51b8bb"><code>fc6c51b</code></a>
Drop ASGI spec version to 2.3 on HTTP (<a
href="https://redirect.github.com/encode/uvicorn/issues/2513">#2513</a>)</li>
<li><a
href="2aea8354ea"><code>2aea835</code></a>
fix(http): enable httptools lenient data (<a
href="https://redirect.github.com/encode/uvicorn/issues/2488">#2488</a>)</li>
<li>See full diff in <a
href="https://github.com/encode/uvicorn/compare/0.32.0...0.32.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `sentry-sdk` from 2.18.0 to 2.19.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/releases">sentry-sdk's
releases</a>.</em></p>
<blockquote>
<h2>2.19.0</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>New: introduce <code>rust_tracing</code> integration. See <a
href="https://docs.sentry.io/platforms/python/integrations/rust_tracing/">https://docs.sentry.io/platforms/python/integrations/rust_tracing/</a>
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3717">#3717</a>)
by <a
href="https://github.com/matt-codecov"><code>@​matt-codecov</code></a></li>
<li>Auto enable Litestar integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3540">#3540</a>)
by <a
href="https://github.com/provinzkraut"><code>@​provinzkraut</code></a></li>
<li>Deprecate <code>sentry_sdk.init</code> context manager (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3729">#3729</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>feat(spotlight): Send PII to Spotlight when no DSN is set (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3804">#3804</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>feat(spotlight): Add info logs when Sentry is enabled (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3735">#3735</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>feat(spotlight): Inject Spotlight button on Django (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3751">#3751</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>feat(spotlight): Auto enable cache_spans for Spotlight on DEBUG (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3791">#3791</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>fix(logging): Handle parameter <code>stack_info</code> for the
<code>LoggingIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3745">#3745</a>)
by <a
href="https://github.com/gmcrocetti"><code>@​gmcrocetti</code></a></li>
<li>fix(pure-eval): Make sentry-sdk[pure-eval] installable with
pip==24.0 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3757">#3757</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>fix(rust_tracing): include_tracing_fields arg to control unvetted
data in rust_tracing integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3780">#3780</a>)
by <a
href="https://github.com/matt-codecov"><code>@​matt-codecov</code></a></li>
<li>fix(aws) Fix aws lambda tests (by reducing event size) (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3770">#3770</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>fix(arq): fix integration with Worker settings as a dict (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3742">#3742</a>)
by <a
href="https://github.com/saber-solooki"><code>@​saber-solooki</code></a></li>
<li>fix(httpx): Prevent Sentry baggage duplication (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3728">#3728</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>fix(falcon): Don't exhaust request body stream (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3768">#3768</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>fix(integrations): Check <code>retries_left</code> before capturing
exception (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3803">#3803</a>)
by <a
href="https://github.com/malkovro"><code>@​malkovro</code></a></li>
<li>fix(openai): Use name instead of description (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3807">#3807</a>)
by <a
href="https://github.com/sourceful-rob"><code>@​sourceful-rob</code></a></li>
<li>test(gcp): Only run GCP tests when they should (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3721">#3721</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>chore: Shorten CI workflow names (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3805">#3805</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>chore: Test with pyspark prerelease (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3760">#3760</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>build(deps): bump codecov/codecov-action from 4.6.0 to 5.0.2 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3792">#3792</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>build(deps): bump actions/checkout from 4.2.1 to 4.2.2 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3691">#3691</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md">sentry-sdk's
changelog</a>.</em></p>
<blockquote>
<h2>2.19.0</h2>
<h3>Various fixes &amp; improvements</h3>
<ul>
<li>New: introduce <code>rust_tracing</code> integration. See <a
href="https://docs.sentry.io/platforms/python/integrations/rust_tracing/">https://docs.sentry.io/platforms/python/integrations/rust_tracing/</a>
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3717">#3717</a>)
by <a
href="https://github.com/matt-codecov"><code>@​matt-codecov</code></a></li>
<li>Auto enable Litestar integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3540">#3540</a>)
by <a
href="https://github.com/provinzkraut"><code>@​provinzkraut</code></a></li>
<li>Deprecate <code>sentry_sdk.init</code> context manager (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3729">#3729</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>feat(spotlight): Send PII to Spotlight when no DSN is set (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3804">#3804</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>feat(spotlight): Add info logs when Sentry is enabled (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3735">#3735</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>feat(spotlight): Inject Spotlight button on Django (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3751">#3751</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>feat(spotlight): Auto enable cache_spans for Spotlight on DEBUG (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3791">#3791</a>)
by <a href="https://github.com/BYK"><code>@​BYK</code></a></li>
<li>fix(logging): Handle parameter <code>stack_info</code> for the
<code>LoggingIntegration</code> (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3745">#3745</a>)
by <a
href="https://github.com/gmcrocetti"><code>@​gmcrocetti</code></a></li>
<li>fix(pure-eval): Make sentry-sdk[pure-eval] installable with
pip==24.0 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3757">#3757</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>fix(rust_tracing): include_tracing_fields arg to control unvetted
data in rust_tracing integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3780">#3780</a>)
by <a
href="https://github.com/matt-codecov"><code>@​matt-codecov</code></a></li>
<li>fix(aws) Fix aws lambda tests (by reducing event size) (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3770">#3770</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></li>
<li>fix(arq): fix integration with Worker settings as a dict (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3742">#3742</a>)
by <a
href="https://github.com/saber-solooki"><code>@​saber-solooki</code></a></li>
<li>fix(httpx): Prevent Sentry baggage duplication (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3728">#3728</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>fix(falcon): Don't exhaust request body stream (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3768">#3768</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>fix(integrations): Check <code>retries_left</code> before capturing
exception (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3803">#3803</a>)
by <a
href="https://github.com/malkovro"><code>@​malkovro</code></a></li>
<li>fix(openai): Use name instead of description (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3807">#3807</a>)
by <a
href="https://github.com/sourceful-rob"><code>@​sourceful-rob</code></a></li>
<li>test(gcp): Only run GCP tests when they should (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3721">#3721</a>)
by <a
href="https://github.com/szokeasaurusrex"><code>@​szokeasaurusrex</code></a></li>
<li>chore: Shorten CI workflow names (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3805">#3805</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>chore: Test with pyspark prerelease (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3760">#3760</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></li>
<li>build(deps): bump codecov/codecov-action from 4.6.0 to 5.0.2 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3792">#3792</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>build(deps): bump actions/checkout from 4.2.1 to 4.2.2 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3691">#3691</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="039c220bcb"><code>039c220</code></a>
Updated changelog</li>
<li><a
href="c83e7428f4"><code>c83e742</code></a>
release: 2.19.0</li>
<li><a
href="8fe5bb4b19"><code>8fe5bb4</code></a>
feat: Send PII to Spotlight when no DSN is set (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3804">#3804</a>)</li>
<li><a
href="295dd8d50f"><code>295dd8d</code></a>
Auto enable Litestar integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3540">#3540</a>)</li>
<li><a
href="bd50c38652"><code>bd50c38</code></a>
fix(httpx): Prevent Sentry baggage duplication (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3728">#3728</a>)</li>
<li><a
href="e9ec6c1812"><code>e9ec6c1</code></a>
test(gcp): Only run GCP tests when they should (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3721">#3721</a>)</li>
<li><a
href="aa6e8fd05c"><code>aa6e8fd</code></a>
fix(falcon): Don't exhaust request body stream (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3768">#3768</a>)</li>
<li><a
href="3e2885322a"><code>3e28853</code></a>
fix(integrations): Check retries_left before capturing exception (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3803">#3803</a>)</li>
<li><a
href="01146bd3ad"><code>01146bd</code></a>
fix(openai): Use name instead of description (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3807">#3807</a>)</li>
<li><a
href="d894fc2320"><code>d894fc2</code></a>
Shorten CI workflow names (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3805">#3805</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-python/compare/2.18.0...2.19.0">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-29 13:38:30 +00:00
Toran Bruce Richards 4aa5f53710
feat(block): Add AI video generator block with Fal txt 2 vid (#8528)
### Background

Implements an AI Video Generator Block for text-to-video models hosted
on Fal.


![image](https://github.com/user-attachments/assets/9cb70015-4174-4419-8c1a-4144f324442f)

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Aarushi <aarushik93@gmail.com>
2024-11-29 13:04:42 +00:00
Nicholas Tindle 75f9b072a6
refactor(backend): Rename & move `IntegrationCredentialsStore` to backend (#8648)
- Move `autogpt_libs.supabase_integration_credentials_store` into
`backend`
   - `.store` -> `backend.integrations.credentials_store`
   - `.types` -> added to `backend.data.model`
- Rename `SupabaseIntegrationCredentialsStore` to
`IntegrationCredentialsStore`

We wanted to get a few security things in quickly in #8403 and had to
make some compromises to do so. This picks those up and fixes them.

- Resolves #8540

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-29 11:48:04 +00:00
Zamil Majdy 63af42dafb
fix(backend): Fix conn_retry decorator possible incorrect behaviour on failed async function (#8836)
This fix is triggered by an error observed on db connection failure on
SupaBase:
```
2024-11-28 07:45:24,724 INFO  [DatabaseManager] Starting...
2024-11-28 07:45:24,726 INFO  [PID-18|DatabaseManager|Prisma-7f32369c-6432-4edb-8e71-ef820332b9e4] Acquiring connection started...
2024-11-28 07:45:24,726 INFO  [PID-18|DatabaseManager|Prisma-7f32369c-6432-4edb-8e71-ef820332b9e4] Acquiring connection completed successfully.
{"is_panic":false,"message":"Can't reach database server at `...pooler.supabase.com:5432`\n\nPlease make sure your database server is running at `....pooler.supabase.com:5432`.","meta":{"database_host":"...pooler.supabase.com","database_port":5432},"error_code":"P1001"}
2024-11-28 07:45:35,153 INFO  [PID-18|DatabaseManager|Prisma-7f32369c-6432-4edb-8e71-ef820332b9e4] Acquiring connection failed: Could not connect to the query engine. Retrying now...
2024-11-28 07:45:36,155 INFO  [PID-18|DatabaseManager|Redis-e14a33de-2d81-4536-b48b-a8aa4b1f4766] Acquiring connection started...
2024-11-28 07:45:36,181 INFO  [PID-18|DatabaseManager|Redis-e14a33de-2d81-4536-b48b-a8aa4b1f4766] Acquiring connection completed successfully.
2024-11-28 07:45:36,183 INFO  [PID-18|DatabaseManager|Pyro-2722cd29-4dbd-4cf9-882f-73842658599d] Starting Pyro Service started...
2024-11-28 07:45:36,189 INFO  [DatabaseManager] Connected to Pyro; URI = PYRO:DatabaseManager@0.0.0.0:8005
2024-11-28 07:46:28,241 ERROR  Error in get_user_integrations: All connection attempts failed
```

Even though the retry log line
```
2024-11-28 07:45:35,153 INFO  [PID-18|DatabaseManager|Prisma-7f32369c-6432-4edb-8e71-ef820332b9e4] Acquiring connection failed: Could not connect to the query engine. Retrying now...
```
is present, the Redis connection still proceeds without waiting for
the retry to complete. This was likely caused by Tenacity not fully
awaiting the DB connection acquisition command.

### Changes 🏗️

* Add special handling for async functions so that the function execution
result is explicitly awaited on each retry (see the sketch below).
* Explicitly raise an exception from `db.connect()` if the database is still
not connected after the `prisma.connect()` call.
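
For illustration, a minimal sketch of the retry behaviour described above; the decorator name, signature, and retry parameters here are assumptions rather than the actual `conn_retry` implementation:

```python
import asyncio
import functools
import logging
import time

logger = logging.getLogger(__name__)


def conn_retry(resource: str, action: str, max_attempts: int = 5, delay: float = 1.0):
    """Illustrative retry decorator: async functions are awaited on every attempt."""

    def decorator(func):
        if asyncio.iscoroutinefunction(func):

            @functools.wraps(func)
            async def async_wrapper(*args, **kwargs):
                for attempt in range(1, max_attempts + 1):
                    try:
                        # Await the wrapped coroutine so each attempt fully completes
                        # (or fails) before the retry logic decides what to do next.
                        return await func(*args, **kwargs)
                    except Exception as exc:
                        logger.info("[%s] %s failed (%d/%d): %s", resource, action, attempt, max_attempts, exc)
                        if attempt == max_attempts:
                            raise
                        await asyncio.sleep(delay)

            return async_wrapper

        @functools.wraps(func)
        def sync_wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:
                    logger.info("[%s] %s failed (%d/%d): %s", resource, action, attempt, max_attempts, exc)
                    if attempt == max_attempts:
                        raise
                    time.sleep(delay)

        return sync_wrapper

    return decorator
```

Raising explicitly when the client still reports no connection after `connect()` then turns a silent failure into another retry attempt instead of letting dependent services proceed.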

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-11-29 09:30:36 +00:00
Aarushi 29f177e70d
feat(blocks): Add Hubspot blocks (#8786)
This PR adds the first few Hubspot blocks so we can create _real_ sales
and marketing agents.

### Changes 🏗️

Added HubSpot blocks (see the API sketch after this list):

- Added auth for HubSpot
- Added Company block
- Added Contact block
- Added Engagement block
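
For context, the Company and Contact blocks wrap HTTP calls against HubSpot's CRM v3 objects API along the lines of the standalone sketch below; the token handling and helper names are illustrative, not the blocks' actual implementation:

```python
import os

import requests

BASE_URL = "https://api.hubapi.com"
# A HubSpot private-app access token; in the platform this would come from the
# credentials system rather than an environment variable.
TOKEN = os.environ["HUBSPOT_ACCESS_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def list_companies(limit: int = 10) -> list[dict]:
    """List companies via the CRM v3 objects API."""
    resp = requests.get(
        f"{BASE_URL}/crm/v3/objects/companies",
        headers=HEADERS,
        params={"limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])


def create_contact(email: str, first_name: str, last_name: str) -> dict:
    """Create a contact via the CRM v3 objects API."""
    resp = requests.post(
        f"{BASE_URL}/crm/v3/objects/contacts",
        headers=HEADERS,
        json={"properties": {"email": email, "firstname": first_name, "lastname": last_name}},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```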

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-11-29 09:26:43 +00:00
Zamil Majdy eeb5b4aa46
fix(backend): Fix `credentials` cost filter not able to filter the block cost (#8837)
We've started defining block costs based on a *partial value* of the
`credentials` field, but that kind of partial matching has never been supported.

### Changes 🏗️

* Add partial object matching to the input data filter used for evaluating
the block cost (see the sketch below).
* Add missing credentials for `ExtractWebsiteContentBlock`.
* Remove the fallback cost on LLM blocks.
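
A minimal sketch of the partial object matching idea, assuming a filter matches when every key it specifies is present, recursively, in the node input; the function and field names are illustrative:

```python
def input_matches_filter(input_data: dict, cost_filter: dict) -> bool:
    """Return True if every key/value in `cost_filter` is (recursively) present
    in `input_data`; keys absent from the filter are ignored."""
    for key, expected in cost_filter.items():
        actual = input_data.get(key)
        if isinstance(expected, dict) and isinstance(actual, dict):
            if not input_matches_filter(actual, expected):
                return False
        elif actual != expected:
            return False
    return True


# A cost filter on a partial `credentials` object now matches even though the
# actual input carries extra fields such as the credential id:
assert input_matches_filter(
    {"credentials": {"provider": "openai", "id": "abc", "type": "api_key"}, "model": "gpt-4o"},
    {"credentials": {"provider": "openai", "type": "api_key"}},
)
```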

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-11-29 08:46:33 +00:00
Zamil Majdy 520b1d7940
feat(frontend): Make Block description searchable on Block list palette (#8839)
Blocks should be easy to search: the name is sometimes not
straightforward, but the description usually is.

<img width="576" alt="image"
src="https://github.com/user-attachments/assets/0528b019-0ebc-4e6f-8a3c-40323a671b13">


### Changes 🏗️

Make the block description searchable.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2024-11-29 07:41:02 +00:00
Zamil Majdy f8b00e55d0 Merge branch 'dev' of github.com:Significant-Gravitas/AutoGPT into dev 2024-11-28 13:22:03 +07:00
Aarushi 4b8087c067
feat(platform/featureflags): Setting up feature flagging (#8718)
* feature flag formatting and linting

* add tests

* update poetry lock

* remove unneeded changes

* fix pyproject

* fix formatting and linting

* pydantic settings

* address comments and format

* alphabetize

* fix lockfile

* fix conflicts
2024-11-27 19:29:57 +00:00
Reinier van der Leer d2f3f53f57
fix(frontend): Fix missing credentials input when no credentials available (#8834)
Fixes breakage from f1414550 (#8772)

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-11-27 18:26:15 +00:00
Reinier van der Leer ab3643388f
ci: Fix workflow status checker script to work with `merge_group` runs 2024-11-27 19:25:36 +01:00
Reinier van der Leer 845c8c51e5
ci: Add `merge_group` trigger to status checker 2024-11-27 19:15:28 +01:00
Reinier van der Leer 118fdeeb1d
ci: Add `merge_group` triggers 2024-11-27 17:57:44 +01:00
dependabot[bot] 97d00455ef
chore(backend): Update 12 dependencies (#8763)
* build(deps): bump the production-dependencies group across 1 directory with 12 updates

Bumps the production-dependencies group with 12 updates in the /autogpt_platform/backend directory:

| Package | From | To |
| --- | --- | --- |
| [aio-pika](https://github.com/mosquito/aio-pika) | `9.4.3` | `9.5.0` |
| [apscheduler](https://github.com/agronholm/apscheduler) | `3.10.4` | `3.11.0` |
| [fastapi](https://github.com/fastapi/fastapi) | `0.115.4` | `0.115.5` |
| [google-api-python-client](https://github.com/googleapis/google-api-python-client) | `2.151.0` | `2.154.0` |
| [groq](https://github.com/groq/groq-python) | `0.11.0` | `0.12.0` |
| [ollama](https://github.com/ollama/ollama-python) | `0.3.3` | `0.4.1` |
| [openai](https://github.com/openai/openai-python) | `1.54.3` | `1.55.1` |
| [pydantic](https://github.com/pydantic/pydantic) | `2.9.2` | `2.10.1` |
| [sentry-sdk](https://github.com/getsentry/sentry-python) | `2.18.0` | `2.19.0` |
| [uvicorn](https://github.com/encode/uvicorn) | `0.32.0` | `0.32.1` |
| [replicate](https://github.com/replicate/replicate-python) | `1.0.3` | `1.0.4` |
| [pinecone](https://github.com/pinecone-io/pinecone-python-client) | `5.3.1` | `5.4.0` |



Updates `aio-pika` from 9.4.3 to 9.5.0
- [Release notes](https://github.com/mosquito/aio-pika/releases)
- [Changelog](https://github.com/mosquito/aio-pika/blob/master/CHANGELOG.md)
- [Commits](https://github.com/mosquito/aio-pika/compare/9.4.3...9.5.0)

Updates `apscheduler` from 3.10.4 to 3.11.0
- [Release notes](https://github.com/agronholm/apscheduler/releases)
- [Changelog](https://github.com/agronholm/apscheduler/blob/3.11.0/docs/versionhistory.rst)
- [Commits](https://github.com/agronholm/apscheduler/compare/3.10.4...3.11.0)

Updates `fastapi` from 0.115.4 to 0.115.5
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.115.4...0.115.5)

Updates `google-api-python-client` from 2.151.0 to 2.154.0
- [Release notes](https://github.com/googleapis/google-api-python-client/releases)
- [Commits](https://github.com/googleapis/google-api-python-client/compare/v2.151.0...v2.154.0)

Updates `groq` from 0.11.0 to 0.12.0
- [Release notes](https://github.com/groq/groq-python/releases)
- [Changelog](https://github.com/groq/groq-python/blob/main/CHANGELOG.md)
- [Commits](https://github.com/groq/groq-python/compare/v0.11.0...v0.12.0)

Updates `ollama` from 0.3.3 to 0.4.1
- [Release notes](https://github.com/ollama/ollama-python/releases)
- [Commits](https://github.com/ollama/ollama-python/compare/v0.3.3...v0.4.1)

Updates `openai` from 1.54.3 to 1.55.1
- [Release notes](https://github.com/openai/openai-python/releases)
- [Changelog](https://github.com/openai/openai-python/blob/main/CHANGELOG.md)
- [Commits](https://github.com/openai/openai-python/compare/v1.54.3...v1.55.1)

Updates `pydantic` from 2.9.2 to 2.10.1
- [Release notes](https://github.com/pydantic/pydantic/releases)
- [Changelog](https://github.com/pydantic/pydantic/blob/main/HISTORY.md)
- [Commits](https://github.com/pydantic/pydantic/compare/v2.9.2...v2.10.1)

Updates `sentry-sdk` from 2.18.0 to 2.19.0
- [Release notes](https://github.com/getsentry/sentry-python/releases)
- [Changelog](https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-python/compare/2.18.0...2.19.0)

Updates `uvicorn` from 0.32.0 to 0.32.1
- [Release notes](https://github.com/encode/uvicorn/releases)
- [Changelog](https://github.com/encode/uvicorn/blob/master/CHANGELOG.md)
- [Commits](https://github.com/encode/uvicorn/compare/0.32.0...0.32.1)

Updates `replicate` from 1.0.3 to 1.0.4
- [Release notes](https://github.com/replicate/replicate-python/releases)
- [Commits](https://github.com/replicate/replicate-python/compare/1.0.3...1.0.4)

Updates `pinecone` from 5.3.1 to 5.4.0
- [Release notes](https://github.com/pinecone-io/pinecone-python-client/releases)
- [Changelog](https://github.com/pinecone-io/pinecone-python-client/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pinecone-io/pinecone-python-client/compare/v5.3.1...v5.4.0)

---
updated-dependencies:
- dependency-name: aio-pika
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: apscheduler
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: fastapi
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: google-api-python-client
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: groq
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: ollama
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: openai
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: pydantic
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: sentry-sdk
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: uvicorn
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: replicate
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: pinecone
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>

* Downgrade pydantic & pinecone

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-27 16:49:23 +00:00
Zamil Majdy ae9bd87161
fix(backend): Spin-up Database manager on rest.py (#8832) 2024-11-27 16:39:08 +00:00
dependabot[bot] a556995d1f
chore(frontend): Update 5 dependencies (#8755)
build(deps): bump the production-dependencies group

Bumps the production-dependencies group in /autogpt_platform/frontend with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [@sentry/nextjs](https://github.com/getsentry/sentry-javascript) | `8.38.0` | `8.40.0` |
| [cookie](https://github.com/jshttp/cookie) | `1.0.1` | `1.0.2` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.3.2` | `9.4.0` |
| [react-shepherd](https://github.com/shepherd-pro/shepherd) | `6.1.4` | `6.1.6` |
| [tailwind-merge](https://github.com/dcastil/tailwind-merge) | `2.5.4` | `2.5.5` |


Updates `@sentry/nextjs` from 8.38.0 to 8.40.0
- [Release notes](https://github.com/getsentry/sentry-javascript/releases)
- [Changelog](https://github.com/getsentry/sentry-javascript/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-javascript/compare/8.38.0...8.40.0)

Updates `cookie` from 1.0.1 to 1.0.2
- [Release notes](https://github.com/jshttp/cookie/releases)
- [Commits](https://github.com/jshttp/cookie/compare/v1.0.1...v1.0.2)

Updates `react-day-picker` from 9.3.2 to 9.4.0
- [Release notes](https://github.com/gpbl/react-day-picker/releases)
- [Changelog](https://github.com/gpbl/react-day-picker/blob/main/CHANGELOG.md)
- [Commits](https://github.com/gpbl/react-day-picker/compare/v9.3.2...v9.4.0)

Updates `react-shepherd` from 6.1.4 to 6.1.6
- [Release notes](https://github.com/shepherd-pro/shepherd/releases)
- [Changelog](https://github.com/shipshapecode/shepherd/blob/main/CHANGELOG.md)
- [Commits](https://github.com/shepherd-pro/shepherd/commits)

Updates `tailwind-merge` from 2.5.4 to 2.5.5
- [Release notes](https://github.com/dcastil/tailwind-merge/releases)
- [Commits](https://github.com/dcastil/tailwind-merge/compare/v2.5.4...v2.5.5)

---
updated-dependencies:
- dependency-name: "@sentry/nextjs"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: cookie
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: react-day-picker
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: react-shepherd
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: tailwind-merge
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-27 16:33:21 +00:00
Nicholas Tindle fd6c1d9f4f
feat(blocks): Various block QoL improvements (#8749)
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Resolves #8357
Resolves #8738
2024-11-27 16:18:53 +00:00
dependabot[bot] 14cc21a843
chore(frontend): Update 14 dev dependencies (#8756)
build(deps-dev): bump the development-dependencies group across 1 directory with 16 updates

Bumps the development-dependencies group with 14 updates in the /autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
| [@playwright/test](https://github.com/microsoft/playwright) | `1.48.2` | `1.49.0` |
| [@storybook/addon-essentials](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/essentials) | `8.4.2` | `8.4.5` |
| [@storybook/addon-interactions](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/interactions) | `8.4.2` | `8.4.5` |
| [@storybook/addon-links](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/links) | `8.4.2` | `8.4.5` |
| [@storybook/addon-onboarding](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/onboarding) | `8.4.2` | `8.4.5` |
| [@storybook/blocks](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/blocks) | `8.4.2` | `8.4.5` |
| [@storybook/nextjs](https://github.com/storybookjs/storybook/tree/HEAD/code/frameworks/nextjs) | `8.4.2` | `8.4.5` |
| [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) | `22.9.0` | `22.9.3` |
| [eslint-plugin-storybook](https://github.com/storybookjs/eslint-plugin-storybook) | `0.11.0` | `0.11.1` |
| [postcss](https://github.com/postcss/postcss) | `8.4.48` | `8.4.49` |
| [prettier-plugin-tailwindcss](https://github.com/tailwindlabs/prettier-plugin-tailwindcss) | `0.6.8` | `0.6.9` |
| [storybook](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/cli) | `8.4.2` | `8.4.5` |
| [tailwindcss](https://github.com/tailwindlabs/tailwindcss) | `3.4.14` | `3.4.15` |
| [typescript](https://github.com/microsoft/TypeScript) | `5.6.3` | `5.7.2` |



Updates `@playwright/test` from 1.48.2 to 1.49.0
- [Release notes](https://github.com/microsoft/playwright/releases)
- [Commits](https://github.com/microsoft/playwright/compare/v1.48.2...v1.49.0)

Updates `@storybook/addon-essentials` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/addons/essentials)

Updates `@storybook/addon-interactions` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/addons/interactions)

Updates `@storybook/addon-links` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/addons/links)

Updates `@storybook/addon-onboarding` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/addons/onboarding)

Updates `@storybook/blocks` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/lib/blocks)

Updates `@storybook/nextjs` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/frameworks/nextjs)

Updates `@storybook/react` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/renderers/react)

Updates `@storybook/test` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/lib/test)

Updates `@types/node` from 22.9.0 to 22.9.3
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

Updates `eslint-plugin-storybook` from 0.11.0 to 0.11.1
- [Release notes](https://github.com/storybookjs/eslint-plugin-storybook/releases)
- [Changelog](https://github.com/storybookjs/eslint-plugin-storybook/blob/main/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/eslint-plugin-storybook/compare/v0.11.0...v0.11.1)

Updates `postcss` from 8.4.48 to 8.4.49
- [Release notes](https://github.com/postcss/postcss/releases)
- [Changelog](https://github.com/postcss/postcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/postcss/postcss/compare/8.4.48...8.4.49)

Updates `prettier-plugin-tailwindcss` from 0.6.8 to 0.6.9
- [Release notes](https://github.com/tailwindlabs/prettier-plugin-tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/prettier-plugin-tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/prettier-plugin-tailwindcss/compare/v0.6.8...v0.6.9)

Updates `storybook` from 8.4.2 to 8.4.5
- [Release notes](https://github.com/storybookjs/storybook/releases)
- [Changelog](https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md)
- [Commits](https://github.com/storybookjs/storybook/commits/v8.4.5/code/lib/cli)

Updates `tailwindcss` from 3.4.14 to 3.4.15
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/v3.4.15/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/compare/v3.4.14...v3.4.15)

Updates `typescript` from 5.6.3 to 5.7.2
- [Release notes](https://github.com/microsoft/TypeScript/releases)
- [Changelog](https://github.com/microsoft/TypeScript/blob/main/azure-pipelines.release.yml)
- [Commits](https://github.com/microsoft/TypeScript/compare/v5.6.3...v5.7.2)

---
updated-dependencies:
- dependency-name: "@playwright/test"
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: development-dependencies
- dependency-name: "@storybook/addon-essentials"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@storybook/addon-interactions"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@storybook/addon-links"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@storybook/addon-onboarding"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@storybook/blocks"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@storybook/nextjs"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@storybook/react"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@storybook/test"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: "@types/node"
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: eslint-plugin-storybook
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: postcss
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: prettier-plugin-tailwindcss
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: storybook
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: tailwindcss
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: typescript
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: development-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-27 16:06:25 +00:00
dependabot[bot] 772baff6db
build(deps): bump docker/build-push-action from 2 to 6 (#8465)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 2 to 6.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v2...v6)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-27 22:52:58 +07:00
Nicholas Tindle 5dd151b41e
feat(tests): add baseline utility for integration testing from frontend ui (#8765) 2024-11-27 09:44:19 +00:00
Zamil Majdy 86fbbae65c
fix(frontend): Add text length limit when displaying Graph & Block name with different length in different places (#8746) 2024-11-27 08:37:26 +00:00
Abhimanyu Yadav 6bfe7ff497
fix(frontend): Add integer type definition for node handles (#8803) 2024-11-27 08:16:29 +01:00
dependabot[bot] effd1e35a3
chore(libs): Update dev dependency Ruff from 0.7.4 to 0.8.0 (#8760)
build(deps-dev): bump ruff

Bumps the development-dependencies group in /autogpt_platform/autogpt_libs with 1 update: [ruff](https://github.com/astral-sh/ruff).


Updates `ruff` from 0.7.4 to 0.8.0
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.7.4...0.8.0)

---
updated-dependencies:
- dependency-name: ruff
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: development-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-26 22:43:06 +00:00
Bently 4aae15d769
feat(blocks): Add Word Character Count Block (#8781)
* Adds Word Character Count Block

Co-Authored-By: SerchioSD <69461657+serchiosd@users.noreply.github.com>

* update test_output

---------

Co-authored-by: SerchioSD <69461657+serchiosd@users.noreply.github.com>
2024-11-26 21:38:43 +00:00
oxygen-fragment f62fa3e1e3
updated URL on README.md (#8767)
* docs(backend): Add `--build` to docker command in Getting Started guide (#8762)

* updated URL on README.md

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-11-26 20:49:47 +00:00
Abhimanyu Yadav 708ed9a91c
fix(platform): handle None value in issue body when fetching GitHub issues (#8773)
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-11-26 20:46:13 +00:00
Abhimanyu Yadav 951948d239
fix(platform): allowing condition block to compare 2 strings (#8771)
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Bently <tomnoon9@gmail.com>
2024-11-26 20:40:51 +00:00
Reinier van der Leer f1414550f9
refactor(platform): Combine per-provider credentials API calls (#8772)
- Add `/integrations/credentials` endpoint which lists all credentials for the authenticated user
- Amend credential fetching logic in front end to fetch all at once instead of per provider

- Resolves #8770
- Resolves (hopefully) #8613
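
A minimal sketch of what such a combined listing endpoint could look like; the route path matches the description above, but the `Credentials` model, the in-memory store, and the auth stub are illustrative assumptions, not the project's actual code.

```python
from dataclasses import dataclass

from fastapi import APIRouter, Depends, FastAPI


# Illustrative stand-ins for the real models and storage layer.
@dataclass
class Credentials:
    id: str
    provider: str
    type: str
    title: str


FAKE_STORE: dict[str, list[Credentials]] = {
    "user-1": [
        Credentials("c1", "openai", "api_key", "OpenAI key"),
        Credentials("c2", "github", "oauth2", "GitHub account"),
    ]
}


def get_user_id() -> str:
    # In the real platform this would come from the auth middleware.
    return "user-1"


router = APIRouter(prefix="/integrations")


@router.get("/credentials")
def list_all_credentials(user_id: str = Depends(get_user_id)) -> list[dict]:
    # One response containing credentials for every provider, so the frontend
    # can fetch everything at once instead of issuing one request per provider.
    return [vars(c) for c in FAKE_STORE.get(user_id, [])]


app = FastAPI()
app.include_router(router)
```
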
2024-11-26 17:03:06 +00:00
dependabot[bot] c6e838da37
chore(market): Update Ruff from 0.7.4 to 0.8.0 (#8758) 2024-11-26 08:12:27 +00:00
Reinier van der Leer 06b403f2b0
docs(backend): Add `--build` to docker command in Getting Started guide (#8762) 2024-11-25 22:51:24 -06:00
dependabot[bot] 03f776681a
build(deps-dev): bump the development-dependencies group in /autogpt_platform/backend with 2 updates (#8761)
build(deps-dev): bump the development-dependencies group

Bumps the development-dependencies group in /autogpt_platform/backend with 2 updates: [poethepoet](https://github.com/nat-n/poethepoet) and [ruff](https://github.com/astral-sh/ruff).


Updates `poethepoet` from 0.30.0 to 0.31.0
- [Release notes](https://github.com/nat-n/poethepoet/releases)
- [Commits](https://github.com/nat-n/poethepoet/compare/v0.30.0...v0.31.0)

Updates `ruff` from 0.7.4 to 0.8.0
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.7.4...0.8.0)

---
updated-dependencies:
- dependency-name: poethepoet
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: development-dependencies
- dependency-name: ruff
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: development-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-25 22:49:45 +00:00
Reinier van der Leer 3d21d54dab
fix(backend): Add missing `strenum` dependency
Follow-up hotfix for #8358
2024-11-25 18:26:13 +00:00
Reinier van der Leer eef9bbe991
feat(platform, blocks): Webhook-triggered blocks (#8358)
- feat(blocks): Add GitHub Pull Request Trigger block

## feat(platform): Add support for Webhook-triggered blocks
- ⚠️ Add `PLATFORM_BASE_URL` setting

- Add webhook config option and `BlockType.WEBHOOK` to `Block`
  - Add check to `Block.__init__` to enforce type and shape of webhook event filter
  - Add check to `Block.__init__` to enforce `payload` input on webhook blocks
  - Add check to `Block.__init__` to disable webhook blocks if `PLATFORM_BASE_URL` is not set

- Add `Webhook` model + CRUD functions in `backend.data.integrations` to represent webhooks created by our system
  - Add `IntegrationWebhook` to DB schema + reference `AgentGraphNode.webhook_id`
    - Add `set_node_webhook(..)` in `backend.data.graph`

- Add webhook-related endpoints:
  - `POST /integrations/{provider}/webhooks/{webhook_id}/ingress` endpoint, to receive webhook payloads, and for all associated nodes create graph executions
    - Add `Node.is_triggered_by_event_type(..)` helper method
  - `POST /integrations/{provider}/webhooks/{webhook_id}/ping` endpoint, to allow testing a webhook
  - Add `WebhookEvent` + pub/sub functions in `backend.data.integrations`

- Add `backend.integrations.webhooks` module, including:
  - `graph_lifecycle_hooks`, e.g. `on_graph_activate(..)`, to handle corresponding webhook creation etc.
    - Add calls to these hooks in the graph create/update endpoints
  - `BaseWebhooksManager` + `GithubWebhooksManager` to handle creating + registering, removing + deregistering, and retrieving existing webhooks, and validating incoming payloads

## Other improvements
- fix(blocks): Allow having an input and output pin with the same name
- fix(blocks): Add tooltip with description in places where block inputs are rendered without `NodeHandle`
- feat(blocks): Allow hiding inputs (e.g. `payload`) with `SchemaField(hidden=True)`
- fix(frontend): Fix `MultiSelector` component styling
- feat(frontend): Add `AlertDialog` UI component
- feat(frontend): Add `NodeMultiSelectInput` component
- feat(backend/data): Add `NodeModel` with `graph_id`, `graph_version`; `GraphModel` with `user_id`
  - Add `make_graph_model(..)` helper function in `backend.data.graph`
- refactor(backend/data): Make `RedisEventQueue` generic and move to `backend.data.execution`
- refactor(frontend): Deduplicate & clean up code for different block types in `generateInputHandles(..)` in `CustomNode`
- dx(backend): Add `MissingConfigError`, `NeedConfirmation` exception
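
A minimal sketch of the ingress flow described above, assuming a hypothetical in-memory webhook registry and a simplified trigger check; the real `Webhook` model, payload validation, and graph-execution plumbing live in `backend.data.integrations` and `backend.integrations.webhooks`.

```python
from fastapi import APIRouter, FastAPI, HTTPException, Request

router = APIRouter(prefix="/integrations")

# Hypothetical stand-in for the Webhook records created by the system.
WEBHOOKS: dict[str, dict] = {
    "wh-123": {"provider": "github", "attached_nodes": ["node-1", "node-2"]}
}


@router.post("/{provider}/webhooks/{webhook_id}/ingress")
async def webhook_ingress(provider: str, webhook_id: str, request: Request) -> dict:
    webhook = WEBHOOKS.get(webhook_id)
    if webhook is None or webhook["provider"] != provider:
        raise HTTPException(status_code=404, detail="Unknown webhook")

    payload = await request.json()
    event_type = request.headers.get("X-GitHub-Event", "unknown")

    # For every node attached to this webhook, a graph execution would be
    # created here; this sketch only reports which nodes would be triggered.
    triggered = [
        node_id
        for node_id in webhook["attached_nodes"]
        # stand-in for Node.is_triggered_by_event_type(..)
        if event_type in ("pull_request", "ping", "unknown")
    ]
    return {
        "event_type": event_type,
        "triggered_nodes": triggered,
        "payload_keys": list(payload) if isinstance(payload, dict) else [],
    }


app = FastAPI()
app.include_router(router)
```
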

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-25 18:42:36 +01:00
Reinier van der Leer 464b5309d7
fix(forge): Fix double `model` kwarg error in `AnthropicProvider.create_chat_completion(..)` (#8666) 2024-11-25 15:41:50 +00:00
Zamil Majdy f00654cb2c
fix(backend): Fix .env file read contention on pyro connection setup (#8736) 2024-11-25 16:55:52 +07:00
thecosmicmuffet bc8ae1f542
docs(platform): Fix url in `README.md` (#8747) 2024-11-23 02:37:41 +00:00
Reinier van der Leer f2816f98e9
Merge branch 'master' into dev 2024-11-21 18:06:08 +00:00
Abhimanyu Yadav 5ee8b62d67
fix: hide content except login when not authenticated to prevent errors (#8398)
* fix: hide content except login when not authenticated to prevent errors

* Remove supabase folder from tracking

* Remove supabase folder from Git tracking

* adding git submodule

* adding git submodule

* Discard changes to .gitignore

* only showing AutoGPT logo if user is not present

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Nicholas Tindle <nicktindle@outlook.com>
Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
2024-11-21 15:57:35 +00:00
Zamil Majdy 8b4bb27077
fix(backend): Re-work the connection input consumption logic for Agent Executor Block (#8710) 2024-11-21 11:05:41 +00:00
Zamil Majdy 6954f4eb0e
fix(backend): Revert non-async routes that are changed to async (#8734) 2024-11-21 11:46:55 +01:00
Zamil Majdy c14ab0c37a
refactor(backend): Remove un-needed join in `fix_llm_provider_credentials` query (#8728) 2024-11-21 09:35:17 +00:00
Reinier van der Leer 13da8af170
chore(platform): Bump version to v0.3.3 2024-11-20 23:25:14 +00:00
Bently 63e3244e7e
feat(platform): Updates to Runner Output UI (#8717)
* feat(platform): Updates to Runner Output UI

* add copy text button to output boxes

* prettier

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-20 19:35:29 +00:00
Simone Busoli 19095be249
docs: replace docker-compose with docker compose (#8502)
Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-20 19:20:46 +00:00
Nicholas Tindle 26a6bd4d10
docs(platform): update docs for security ssrf (#8675) 2024-11-20 15:29:45 +00:00
Nicholas Tindle 92bfbfad57
feat: generate simple auth tests (#8709) 2024-11-20 15:21:16 +00:00
Krzysztof Czerwinski cf43248ab8
feat(frontend): Show Agent Output on Monitor page (#8501)
* Show Output in Monitor

* Updates

* Updates

* Move hardcoded ids to a dedicated enum

---------

Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-20 15:08:22 +00:00
dependabot[bot] aea6e7caed
build(deps): bump the production-dependencies group in /autogpt_platform/frontend with 7 updates (#8703)
build(deps): bump the production-dependencies group

Bumps the production-dependencies group in /autogpt_platform/frontend with 7 updates:

| Package | From | To |
| --- | --- | --- |
| @radix-ui/react-icons | `1.3.1` | `1.3.2` |
| [@radix-ui/react-scroll-area](https://github.com/radix-ui/primitives) | `1.2.0` | `1.2.1` |
| [@radix-ui/react-tooltip](https://github.com/radix-ui/primitives) | `1.1.3` | `1.1.4` |
| [@sentry/nextjs](https://github.com/getsentry/sentry-javascript) | `8.37.1` | `8.38.0` |
| [elliptic](https://github.com/indutny/elliptic) | `6.6.0` | `6.6.1` |
| [lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react) | `0.456.0` | `0.460.0` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.3.0` | `9.3.2` |


Updates `@radix-ui/react-icons` from 1.3.1 to 1.3.2

Updates `@radix-ui/react-scroll-area` from 1.2.0 to 1.2.1
- [Changelog](https://github.com/radix-ui/primitives/blob/main/release-process.md)
- [Commits](https://github.com/radix-ui/primitives/commits)

Updates `@radix-ui/react-tooltip` from 1.1.3 to 1.1.4
- [Changelog](https://github.com/radix-ui/primitives/blob/main/release-process.md)
- [Commits](https://github.com/radix-ui/primitives/commits)

Updates `@sentry/nextjs` from 8.37.1 to 8.38.0
- [Release notes](https://github.com/getsentry/sentry-javascript/releases)
- [Changelog](https://github.com/getsentry/sentry-javascript/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-javascript/compare/8.37.1...8.38.0)

Updates `elliptic` from 6.6.0 to 6.6.1
- [Commits](https://github.com/indutny/elliptic/compare/v6.6.0...v6.6.1)

Updates `lucide-react` from 0.456.0 to 0.460.0
- [Release notes](https://github.com/lucide-icons/lucide/releases)
- [Commits](https://github.com/lucide-icons/lucide/commits/0.460.0/packages/lucide-react)

Updates `react-day-picker` from 9.3.0 to 9.3.2
- [Release notes](https://github.com/gpbl/react-day-picker/releases)
- [Changelog](https://github.com/gpbl/react-day-picker/blob/main/CHANGELOG.md)
- [Commits](https://github.com/gpbl/react-day-picker/compare/v9.3.0...v9.3.2)

---
updated-dependencies:
- dependency-name: "@radix-ui/react-icons"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: "@radix-ui/react-scroll-area"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: "@radix-ui/react-tooltip"
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: "@sentry/nextjs"
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: elliptic
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
- dependency-name: lucide-react
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
- dependency-name: react-day-picker
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-20 17:20:27 +04:00
dependabot[bot] c84cc292f1
build(deps): bump fastapi from 0.115.4 to 0.115.5 in /autogpt_platform/market in the production-dependencies group (#8705)
build(deps): bump fastapi

Bumps the production-dependencies group in /autogpt_platform/market with 1 update: [fastapi](https://github.com/fastapi/fastapi).


Updates `fastapi` from 0.115.4 to 0.115.5
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.115.4...0.115.5)

---
updated-dependencies:
- dependency-name: fastapi
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: production-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-20 17:08:07 +04:00
Swifty d84ddfcf1a
fix(block): Updated model_version to prevent conflicts with pydantic naming (#8729)
Renamed the `model_version` field to avoid conflicts with pydantic's reserved `model_` naming
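
For context, a minimal illustration (not the project's code) of why a field named `model_version` clashes with pydantic v2, which reserves the `model_` prefix for its own methods and warns about fields that shadow it; the renamed field name below is illustrative.

```python
from pydantic import BaseModel


# pydantic v2 reserves the "model_" prefix (model_dump, model_validate, ...),
# so declaring a field called "model_version" emits a UserWarning about a
# conflict with the protected "model_" namespace at class definition time.
class BlockInputBad(BaseModel):
    model_version: str  # triggers the protected-namespace warning


# Renaming the field side-steps the clash entirely.
class BlockInputGood(BaseModel):
    image_model_version: str


print(BlockInputGood(image_model_version="v1").image_model_version)
```
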
2024-11-20 12:51:06 +00:00
Zamil Majdy 5fa5b7104a
fix(frontend): Monitor Page got all the request doubled on each page refresh (#8727)
fix(frontend): Avoid refreshing page on each auth state change event
2024-11-20 10:43:34 +00:00
Reinier van der Leer 33dd2eb919
dx(backend): Fix `pre-commit` `isort` step (#8726)
- Set `tool.isort.profile = "black"`
- Explicitly pass the first-party package name in the `isort` jobs in the `pre-commit` config
2024-11-19 19:51:20 -06:00
Aarushi a5734a57d5
fix(platform): Remove settings endpoint (#8715)
remove settings endpoint
2024-11-19 23:39:40 +00:00
Zamil Majdy 274419d393
fix(backend): Improve typing for blocks StepThroughItemsBlock, CountdownTimerBlock, AddToListBlock, AddToDictionaryBlock (#8713) 2024-11-19 20:53:34 +00:00
Aarushi 520d0ca0e4
delete infra folder (#8555)
* delete infra folder

* remove ci
2024-11-19 14:21:57 +00:00
dependabot[bot] 84076ebee1
build(deps-dev): bump the development-dependencies group in /autogpt_platform/backend with 2 updates (#8699)
build(deps-dev): bump the development-dependencies group

Bumps the development-dependencies group in /autogpt_platform/backend with 2 updates: [ruff](https://github.com/astral-sh/ruff) and [pyright](https://github.com/RobertCraigie/pyright-python).


Updates `ruff` from 0.7.3 to 0.7.4
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.7.3...0.7.4)

Updates `pyright` from 1.1.388 to 1.1.389
- [Release notes](https://github.com/RobertCraigie/pyright-python/releases)
- [Commits](https://github.com/RobertCraigie/pyright-python/compare/v1.1.388...v1.1.389)

---
updated-dependencies:
- dependency-name: ruff
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: pyright
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 13:28:34 +00:00
dependabot[bot] 2e934dfff3
build(deps-dev): bump the development-dependencies group in /autogpt_platform/market with 2 updates (#8706)
build(deps-dev): bump the development-dependencies group

Bumps the development-dependencies group in /autogpt_platform/market with 2 updates: [ruff](https://github.com/astral-sh/ruff) and [pyright](https://github.com/RobertCraigie/pyright-python).


Updates `ruff` from 0.7.3 to 0.7.4
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.7.3...0.7.4)

Updates `pyright` from 1.1.388 to 1.1.389
- [Release notes](https://github.com/RobertCraigie/pyright-python/releases)
- [Commits](https://github.com/RobertCraigie/pyright-python/compare/v1.1.388...v1.1.389)

---
updated-dependencies:
- dependency-name: ruff
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
- dependency-name: pyright
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 13:08:08 +00:00
dependabot[bot] c1c3345bc0
build(deps-dev): bump ruff from 0.7.3 to 0.7.4 in /autogpt_platform/autogpt_libs in the development-dependencies group (#8701)
build(deps-dev): bump ruff

Bumps the development-dependencies group in /autogpt_platform/autogpt_libs with 1 update: [ruff](https://github.com/astral-sh/ruff).


Updates `ruff` from 0.7.3 to 0.7.4
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/ruff/compare/0.7.3...0.7.4)

---
updated-dependencies:
- dependency-name: ruff
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: development-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 10:10:26 +00:00
dependabot[bot] fb9a543e35
build(deps): bump pyjwt from 2.9.0 to 2.10.0 in /autogpt_platform/autogpt_libs in the production-dependencies group (#8700)
build(deps): bump pyjwt

Bumps the production-dependencies group in /autogpt_platform/autogpt_libs with 1 update: [pyjwt](https://github.com/jpadilla/pyjwt).


Updates `pyjwt` from 2.9.0 to 2.10.0
- [Release notes](https://github.com/jpadilla/pyjwt/releases)
- [Changelog](https://github.com/jpadilla/pyjwt/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/jpadilla/pyjwt/compare/2.9.0...2.10.0)

---
updated-dependencies:
- dependency-name: pyjwt
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: production-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 10:06:05 +00:00
Reinier van der Leer e81083d9ab
fix(frontend): Allow importing agent file with empty description (#8670)
Check structure of agent file with `obj[key] != null` rather than `!!obj[key]` to allow empty strings
2024-11-18 23:57:02 +00:00
Coenraad Loubser 865e3c056d
ref(classic): Do not 'rm -rf <unquoted variable>' when removing classic env (#8417)
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
2024-11-18 17:09:10 -06:00
Aarushi 8fccf2eed3
fix(platform/builder): Add heartbeat mechanism (#8665)
* add heartbeat mechanism

* formatting data

* import List

* another import fix

* wip

* formatting and linting
2024-11-18 16:33:15 +00:00
Reinier van der Leer 1f34f78e4e
build(frontend): Optimize Docker build time and image size (#8695)
This PR reduces image size by 4.9GB (93%) and reduces uncached build time from ~7m to ~5m20s.

- Use cache mount to prevent Yarn cache from being included in `yarn install` layer
- Leverage Next.js output tracing to generate minimal application w/ tree-shaken dependencies
- Add non-root user following the Next.js reference Dockerfile
2024-11-18 15:07:03 +00:00
Toran Bruce Richards 29cff1bb4e
feat(blocks): Add Open Router integration with a large selection of new models (#8653)
* feat: Add Open Router integration credentials

- Added support for Open Router integration credentials in the Supabase integration credentials store.
- Updated the LLM provider field to include "open_router" as a valid provider option.
- Added Open Router API key field to the backend settings.
- Updated the profile page to display the Open Router integration credentials.
- Updated the credentials input and provider components to include Open Router as a provider option.
- Updated the autogpt-server-api types to include "open_router" as a provider name.
- Updated the LLM provider schema to include "open_router" as a valid provider name.

- Added GEMINI_FLASH_1_5_8B as the first Open Router LLM

* Add type ignore to new llm prompt to match the rest of them.

* Update LlmModel with a selection of new OpenRouter models

* format
2024-11-18 14:03:50 +00:00
Zamil Majdy 402789d8cd
fix(frontend): avoid displaying long description text for block (#8688)
Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
2024-11-18 13:25:21 +00:00
Zamil Majdy 6fa4b8cb11
fix(backend): Add the lower cap of the user credits to zero (#8682)
fix(backend): Add lower cap of the user credits to zero
2024-11-18 13:12:42 +00:00
Zamil Majdy f36d95aaa8
fix(backend): Avoid falling back to default user unless ENABLED_AUTH is set to False (#8691) 2024-11-18 13:01:21 +00:00
Krzysztof Czerwinski a660833744
refactor(frontend): Update buttons to edit agents in Monitor (#8687)
Update Monitor buttons

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-18 12:56:40 +00:00
Reinier van der Leer e840106949
fix(backend): Resolve Pydantic warning about missing `secrets_dir` (#8692)
- Remove `secrets_dir` and other references to `get_secrets_path()`
- Remove unused `get_config_path()`

Follow-up to #8521, which removed the `secrets` dir but not the references to it.
2024-11-18 12:41:18 +00:00
Reinier van der Leer 6c109adf0b
Merge branch 'master' into dev 2024-11-18 12:32:24 +00:00
Reinier van der Leer bff0dc3d82
chore(platform): Bump version to v0.3.1 2024-11-18 12:11:07 +01:00
Zamil Majdy cd7dfbb8b3
fix(frontend): Typing in the NodeKeyValueInput field causes the field to un-focus (#8680)
* fix(frontend): Typing in the "Prompt Values" input field causes the field to un-focus

* Add comment

* Rephrase
2024-11-18 10:51:54 +00:00
Zamil Majdy a2895a2ca0
fix(backend): Define executionmanager hostname for local docker-mode (#8681) 2024-11-18 09:16:00 +00:00
Craig Perkins e30dac575d
Update links to images in FORGE-QUICKSTART.md (#8517)
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-11-16 21:23:54 -06:00
Reinier van der Leer 918538147c
fix(backend): Add migrations to fix credentials inputs with invalid provider "llm" (#8674)
In #8524, the "llm" credentials provider was replaced. There are still entries with `"provider": "llm"` in the system though, and those break if not migrated.

- SQL migration to fix the obvious ones where we know the provider from `credentials.id`
- Non-SQL migration to fix the rest
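
A rough sketch of the idea behind the non-SQL part of such a migration, under the assumption that node credentials inputs are stored as JSON dicts; the id-to-provider mapping and the helper below are illustrative, not the actual migration code.

```python
# Default system credentials whose provider is known from their fixed id
# (illustrative ids, not the real ones).
KNOWN_CREDENTIAL_PROVIDERS = {
    "openai-default": "openai",
    "anthropic-default": "anthropic",
    "groq-default": "groq",
}


def fix_credentials_input(credentials_meta: dict) -> dict:
    """Rewrite a stored credentials input whose provider is the removed 'llm' value."""
    if credentials_meta.get("provider") != "llm":
        return credentials_meta
    known = KNOWN_CREDENTIAL_PROVIDERS.get(credentials_meta.get("id", ""))
    if known is not None:
        return {**credentials_meta, "provider": known}
    # Entries whose provider cannot be inferred from the id are left for the
    # follow-up migration to handle.
    return credentials_meta


print(fix_credentials_input({"id": "openai-default", "provider": "llm", "type": "api_key"}))
```
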
2024-11-16 01:07:05 +01:00
Reinier van der Leer 1e8a272ac6
fix(backend): Add migrations to fix credentials inputs with invalid provider "llm" (vol. 5)
Five times the charm
2024-11-16 00:41:56 +01:00
Reinier van der Leer 1c6890486f
fix(backend): Add migrations to fix credentials inputs with invalid provider "llm" (vol. 4)
Another attempt at unbreaking this raw Prisma query
2024-11-16 00:17:43 +01:00
Reinier van der Leer 29688758c4
fix(backend): Add migrations to fix credentials inputs with invalid provider "llm" (vol. 3)
Fix User table reference in raw SQL query in non-Prisma migration
2024-11-15 20:56:04 +01:00
Reinier van der Leer 2a66295a92
fix(backend): Add migrations to fix credentials inputs with invalid provider "llm" (vol. 2)
Fix breaking SQL double-casting issue in the SQL migration
2024-11-15 20:32:24 +01:00
Reinier van der Leer 4db8e746d7
fix(backend): Add migrations to fix credentials inputs with invalid provider "llm" (#8674)
In #8524, the "llm" credentials provider was replaced. There are still entries with `"provider": "llm"` in the system though, and those break if not migrated.

- SQL migration to fix the obvious ones where we know the provider from `credentials.id`
- Non-SQL migration to fix the rest
2024-11-15 20:18:02 +01:00
Reinier van der Leer 0551bec096
Merge branch 'master' into dev 2024-11-15 15:17:45 +01:00
Toran Bruce Richards bd2f172e6d
tweak(docs): Update Block File Path in Documentation (#8662)
Update new_blocks.md documentation
2024-11-15 12:46:31 +00:00
Reinier van der Leer 9a4ff9023d
bump version to v0.3.0 2024-11-15 11:54:42 +01:00
Zamil Majdy 8987fdd48c
feat(backend): Enable json parsing with typing & conversion (#8578) 2024-11-15 17:28:59 +07:00
Zamil Majdy 6a1cea4c4e
fix(backend): Add execution persistence for execution scheduler service (#8649)
* fix(backend): Add execution persistence for execution scheduler service

* scheduler REST API cleanup

* Fix to binary

* Adapt UI with new API

* Remove schedule.py

* Remove unused class

* Fix linting
2024-11-15 11:17:37 +01:00
Zamil Majdy f27f596f58
fix(frontend): Newly typed text in Input fields vanishes on scroll (#8657) 2024-11-15 08:00:59 +00:00
Nicholas Tindle ea214d9168
ci: fix classic ci (#8338)
* ci(frontend,backend,classic): update branch from develop to dev

* ci(frontend, infra): enable ci on other tools

* Update classic-autogpt-docker-ci.yml

* fix: don't error if the folder exists

* fix: drop bad test

* Revert "fix: drop bad test"

This reverts commit c478d3cf4c.

* fix: turn off the correct test 👀

* fix: remove more

* Discard changes to .github/workflows/classic-autogpt-ci.yml

* Update classic-autogpt-docker-ci.yml

* Update classic-autogpt-docker-release.yml

* Update classic-autogpts-ci.yml

* Discard changes to .github/workflows/classic-forge-ci.yml

* Discard changes to .github/workflows/classic-autogpts-ci.yml

* Discard changes to .github/workflows/classic-python-checks.yml

* Discard changes to .github/workflows/repo-pr-label.yml

* Discard changes to .github/workflows/platform-backend-ci.yml

* Update classic-benchmark-ci.yml

* Update classic-frontend-ci.yml
2024-11-15 01:48:00 -06:00
Reinier van der Leer f9633ffb71
Revert "fix(platform): Remove migrate and encrypt function" (#8654)
Reverts c707ee9 (#8646)

The problem analysis that led to #8646 contained some errors, so the migration removed in the PR doesn't seem to have been the cause of the problem we were hunting. Also, this migration is an essential part of the security improvement that we made 2 weeks ago.
2024-11-14 23:34:30 +00:00
Bently e140873dd4
Feat(Builder/tutorial): Updates to fix tutorial (#8655)
Feat(Builder/tutorial): Updates to fix tutorial
2024-11-14 22:25:32 +00:00
Abhimanyu Yadav dd0081ab35
feat(platform) : scheduling agent runner (#8634)
* add: ui for scheduling agent

* adding requests and type for schedule endpoints

* feat: monitor schedules on monitor page

* add: Complete monitor page

* fix filter on monitor page

* fix linting

* PR nits

* Added Docker Compose env var

---------

Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-14 16:18:56 +01:00
Abhimanyu Yadav bbbdb5665b
feat(platform): Add api generator functions and endpoints (#8597)
* add: api generator functions and endpoints

* Rebase onto dev, refactor API manager location, remove suspended key revoke, and update API code for Prisma compatibility

* add: key_manager

* reverting changes to poetry.lock

* add: changing hash mechanism in API Manager

* add: changing hash mechanism in API Manager

* fixing some simple bugs

* fix linting and add better error handling

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-14 14:33:27 +00:00
Toran Bruce Richards e628a25533
feat(blocks): Add `AIImageGeneratorBlock` (#8525)
* feat(block): Add AIImageGeneratorBlock

This commit adds the AIImageGeneratorBlock class to the backend. The AIImageGeneratorBlock is responsible for generating images using various AI models through a unified interface.

* Remove unsupported inputs and add more styles

* Update autogpt_platform/backend/backend/blocks/ai_image_generator_block.py

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>

* run format

* Add test mock

* mock client run

* Refactor AIImageGeneratorBlock to use a separate function for running the client

* Update Credential description

* Rename ModelProvider to ImageGenModel

* Add missing block run function

* fix mock

* .

* Refactor AIImageGeneratorBlock to move run_client function inside class

* Fix broken reference to run client and tidy code.

* Refactor AIImageGeneratorBlock to improve code structure and error handling

* Move client into run client instantiation function.

* Refactor AIImageGeneratorBlock to handle output as FileOutput and improve error handling

* run format
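
A minimal sketch of what the separate run-client helper mentioned above might look like, using the `replicate` Python client; the model identifier and input fields are illustrative assumptions rather than the block's actual configuration.

```python
import os

import replicate


def run_image_client(prompt: str, model: str = "black-forest-labs/flux-schnell") -> str:
    """Generate an image and return its URL (assumes REPLICATE_API_TOKEN is set)."""
    client = replicate.Client(api_token=os.environ["REPLICATE_API_TOKEN"])
    output = client.run(model, input={"prompt": prompt})
    # Depending on the model, the output is a list of URLs / FileOutput objects
    # or a single object; normalise to a plain URL string for the output pin.
    first = output[0] if isinstance(output, list) else output
    return getattr(first, "url", str(first))


if __name__ == "__main__":
    print(run_image_client("a lighthouse at dusk, watercolor"))
```
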

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-11-14 12:28:24 +00:00
Nicholas Tindle 52b3148196
feat(frontend): check auth before allowing actions to run (#8633) 2024-11-14 09:45:31 +00:00
Toran Bruce Richards 05c76738a4
tweak(frontend): Add jina and unreal to hidden credentials list in frontend (#8642)
Adds missing hidden credentials
2024-11-14 09:33:00 +00:00
vishesh10 639242ac68
Add provision for other languages in Youtube Video Block (#8630)
Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
2024-11-14 09:20:46 +00:00
Zamil Majdy ce667f6287
feat(frontend): Center initial canvas (#8644)
* fix(frontend): Fix client-side validation for Agent Executor Block

* Fix zoom scale calculation

* Fix zoom scale calculation
2024-11-14 09:15:30 +00:00
Zamil Majdy 98ab525e39
fix(frontend): Fix input-field update on empty & default value (#8647)
* fix(frontend): Fix input-field update on empty & default value

* Fix error message

* Revert
2024-11-14 09:03:12 +00:00
Aarushi c707ee9cb6
fix(platform): Remove migrate and encrypt function (#8646)
remove migrate and encrypt function
2024-11-13 23:09:26 +00:00
Nicholas Tindle b64c536eca
Create SECURITY.md (#8645) 2024-11-13 22:28:52 +00:00
Reinier van der Leer 5c0f979b9c
fix(frontend): Remove double title on credentials input (#8638)
- Add condition to hide `credentials` input title in `CustomNode:generateInputHandles`
- Add `title={schema.description}` to `<CredentialsInput>` title element
2024-11-13 19:37:55 +00:00
Nicholas Tindle b048385091
fix(classic): update docs for security deprecation (#8632)
* Create README.md

* Update README.md

* Update index.md

* Update README.md

* Update index.md

* Update index.md
2024-11-13 18:52:21 +00:00
Zamil Majdy 67244759c7
fix(frontend): Fix client-side validation for Agent Executor Block (#8643)
* feat(frontend): Center initial canvas & add option to open graph on agent executor block

* Removed unused variable
2024-11-13 18:31:22 +00:00
Kaitlyn Barnard a3655b8a85
Adding Google Analytics to docs site (#8640)
Adding GA tag to docs site

Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
2024-11-13 17:03:28 +00:00
Toran Bruce Richards aafc101224
tweak(backend): Update all block costs (#8639)
* Add support for default credentials to unreal block

* Refactor block cost configuration and add new blocks

This commit refactors the block cost configuration file and adds support for new blocks. The changes include:
- Importing the `AIMusicGeneratorBlock`, `JinaEmbeddingBlock`, and `UnrealTextToSpeechBlock` classes
- Updating the `BLOCK_COSTS` dictionary to include costs for the new blocks

These changes enable the usage of the newly introduced blocks.
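
A hedged sketch of what a centralised block-cost table of this shape can look like; the block names match those mentioned above, but the cost figures and the `BlockCost` structure are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BlockCost:
    cost_amount: int        # credits charged per run
    cost_type: str = "run"  # charged per run in this sketch; other schemes exist


# Keyed by block class name here for brevity; the real table maps block
# classes to their cost configuration in a single file.
BLOCK_COSTS: dict[str, list[BlockCost]] = {
    "AIMusicGeneratorBlock": [BlockCost(cost_amount=11)],
    "JinaEmbeddingBlock": [BlockCost(cost_amount=2)],
    "UnrealTextToSpeechBlock": [BlockCost(cost_amount=5)],
}


def cost_of(block_name: str) -> int:
    entries = BLOCK_COSTS.get(block_name, [])
    return entries[0].cost_amount if entries else 0


print(cost_of("JinaEmbeddingBlock"))
```
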
2024-11-13 16:05:32 +00:00
Reinier van der Leer ef3f7aad18
fix(frontend): Unbreak credentials input on single-provider blocks (vol. 2)
Fix bad condition introduced in aaa0b79f (#8636) to resolve #8635
2024-11-13 15:30:36 +01:00
Reinier van der Leer aaa0b79f08
fix(frontend): Unbreak credentials input on single-provider blocks (#8636)
- Resolves #8635

- fix(frontend): Fix type mismatch of `CredentialsField` schema between frontend and backend
   - Fix usages of `credentialsSchema.credentials_provider`

- refactor(backend): Create `CredentialsFieldSchemaExtra` model in backend so it can be mirrored directly in frontend
   - Add check to enforce multi-provider `CredentialsField` always has `discriminator`

- dx: Add type checking shortcut `yarn type-check` / `npm run type-check` for frontend
2024-11-13 13:48:15 +00:00
Krzysztof Czerwinski e907ffda6e
feat(platform): Simplify Credentials UX (#8524)
- Change `provider` of default credentials to actual provider names (e.g. `anthropic`), remove `llm` provider
- Add `discriminator` and `discriminator_mapping` to `CredentialsField`, which let the `useCredentials` hook filter the credentials input to only the providers matching the chosen model (thanks @ntindle for the idea!); e.g. if the user chooses `GPT4_TURBO`, only OpenAI credentials are allowed
- Choose credentials automatically and hide credentials input on the node completely if there's only one possible option
- Move `getValue` and `parseKeys` to utils
- Add `ANTHROPIC`, `GROQ` and `OLLAMA` to providers in frontend `types.ts`
- Add `hidden` field to credentials that is used for default system keys to hide them in user profile
- Now `provider` field in `CredentialsField` can accept multiple providers as a list
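
A small, self-contained illustration of the discriminator idea described above, assuming plain dicts for the credentials and a toy model-to-provider mapping; the real implementation lives in `CredentialsField` and the `useCredentials` hook.

```python
# Toy discriminator mapping: value of the discriminator field (the chosen
# model) -> the provider whose credentials are allowed.
DISCRIMINATOR_MAPPING = {
    "GPT4_TURBO": "openai",
    "CLAUDE_3_5_SONNET": "anthropic",
    "LLAMA3_70B": "groq",
}


def filter_credentials(available: list[dict], chosen_model: str) -> list[dict]:
    """Keep only credentials whose provider matches the chosen model."""
    provider = DISCRIMINATOR_MAPPING.get(chosen_model)
    return [c for c in available if c["provider"] == provider]


creds = [
    {"id": "c1", "provider": "openai", "title": "OpenAI key"},
    {"id": "c2", "provider": "anthropic", "title": "Anthropic key"},
]

# Only OpenAI credentials remain; with a single match the UI can also
# auto-select it and hide the credentials input entirely.
print(filter_credentials(creds, "GPT4_TURBO"))
```
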

-----------------
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2024-11-12 16:55:48 +01:00
Zamil Majdy ef7e50403e
refactor(backend): Centralize Block Cost into a Single File (#8623) 2024-11-12 14:09:59 +00:00
Zamil Majdy 1e872406ca
feat(platform): Introduced Agent Execution Block (#8533) 2024-11-12 13:03:15 +07:00
Abhimanyu Yadav 5ee909f687
fix: show error toast on Run failure when inputs or credentials are i… (#8391)
* fix: show error toast on Run failure when inputs or credentials are invalid

* Remove supabase folder from tracking

* revert supabase

* remove toast

---------

Co-authored-by: Avhimanyu <2023ebcs396@online.bits-pilani.ac.in>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-11 22:43:15 -06:00
Nicholas Tindle ff1fa2af2d
[Snyk] Security upgrade zipp from 3.15.0 to 3.19.1 (#8622)
fix: docs/requirements.txt to reduce vulnerabilities


The following vulnerabilities are fixed by pinning transitive dependencies:
- https://snyk.io/vuln/SNYK-PYTHON-ZIPP-7430899

Co-authored-by: snyk-bot <snyk-bot@snyk.io>
2024-11-11 18:22:03 -06:00
dependabot[bot] ee3252bdb1
build(deps): bump cookie from 0.7.0 to 1.0.1 in /autogpt_platform/frontend (#8616)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-12 00:15:45 +00:00
Nicholas Tindle 4a8f3dbbb1
[Snyk] Security upgrade urllib3 from 2.0.7 to 2.2.2 (#8620)
Co-authored-by: snyk-bot <snyk-bot@snyk.io>
2024-11-11 17:45:56 -06:00
dependabot[bot] 4a163e5b54
build(deps-dev): bump the development-dependencies group in /autogpt_platform/backend with 3 updates (#8611)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-11-11 17:31:13 -06:00
dependabot[bot] ca0b2311e8
build(deps): bump the production-dependencies group in /autogpt_platform/frontend with 8 updates (#8614)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-11 17:25:25 -06:00
dependabot[bot] d03fd930c6
build(deps-dev): bump the development-dependencies group in /autogpt_platform/frontend with 12 updates (#8615)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-11-11 22:34:43 +00:00
dependabot[bot] 603fec3467
build(deps): bump the production-dependencies group in /autogpt_platform/backend with 2 updates (#8610)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-11-11 22:21:44 +00:00
dependabot[bot] c53c7f8dd8
build(deps-dev): bump the development-dependencies group in /autogpt_platform/market with 2 updates (#8612)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-11 22:13:43 +00:00
dependabot[bot] ce3539ff16
build(deps-dev): bump ruff from 0.7.2 to 0.7.3 in /autogpt_platform/autogpt_libs in the development-dependencies group (#8618)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-11 16:10:08 -06:00
dependabot[bot] ea8f164b93
build(deps): bump supabase from 2.9.1 to 2.10.0 in /autogpt_platform/autogpt_libs in the production-dependencies group (#8617)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-11 21:48:57 +00:00
Nicholas Tindle 3c0dea0017
[Snyk] Security upgrade python from 3.11-slim-buster to 3.11.10-slim-bookworm (#8619)
fix: autogpt_platform/backend/Dockerfile to reduce vulnerabilities

The following vulnerabilities are fixed with an upgrade:
- https://snyk.io/vuln/SNYK-DEBIAN10-GNUTLS28-6159414
- https://snyk.io/vuln/SNYK-DEBIAN10-NCURSES-1655739
- https://snyk.io/vuln/SNYK-DEBIAN10-NCURSES-1655739
- https://snyk.io/vuln/SNYK-DEBIAN10-NCURSES-5421196
- https://snyk.io/vuln/SNYK-DEBIAN10-NCURSES-5421196

Co-authored-by: snyk-bot <snyk-bot@snyk.io>
2024-11-11 20:33:39 +00:00
dependabot[bot] 9e4246602d
build(deps): bump peter-evans/repository-dispatch from 2 to 3 (#8609)
Bumps [peter-evans/repository-dispatch](https://github.com/peter-evans/repository-dispatch) from 2 to 3.
- [Release notes](https://github.com/peter-evans/repository-dispatch/releases)
- [Commits](https://github.com/peter-evans/repository-dispatch/compare/v2...v3)

---
updated-dependencies:
- dependency-name: peter-evans/repository-dispatch
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-11 19:59:12 +00:00
Toran Bruce Richards 1f0cbc6500
feat(block): Add AI Music Generator Block with Meta Music Gen (#8532)
* feat(platform): Add AIMusicGeneratorBlock for music generation

* refactor(platform): Refactor AIMusicGeneratorBlock for improved error handling and logging

* refactor(ui): Refactor ContentRenderer to support audio rendering

* format

* Frontend format and lint
2024-11-11 09:36:39 -08:00
Aarushi e6e47373ac
feat(blocks/jina): Add default credentials for Jina (#8603)
add jina defaults
2024-11-10 17:18:53 +00:00
Aarushi f981a74a10
Merge master into dev 2024-11-09 19:10:03 -06:00
Aarushi 09dd391041
Merge dev into master 2024-11-09 19:09:09 -06:00
Nicholas Tindle 0b5b95eff5
build: add launch.json debugging for vscode (#8496)
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-08 20:49:13 +00:00
Aarushi 4adbbc52f2 Merge branch 'master' into dev 2024-11-08 11:52:14 -06:00
Aarushi 359ae8307a
feat(backend): Add API key DB table (#8593)
* add api key db tables

* remove uniqueness constraint on prefix

* add postfix
2024-11-08 17:48:37 +00:00
Aarushi f719c7e70e
feat(blocks): Pinecone blocks (#8535)
* update pinecone

* update blocks

* fix linting

* update test

* update requests

* mock function
2024-11-08 17:13:55 +00:00
Reinier van der Leer c960bd870c
dx: Clean up PR template (#8541)
* dx: Clean up PR template
2024-11-08 17:44:55 +01:00
Reinier van der Leer dfb7cf19f7
dx: Fix `pre-commit` config (#8584)
- fix naming of hooks
- fix `pyright` hooks (b0rked by repo restructure)
- fix `forge` path (b0rked by faulty replace-all when the repo was restructured)
- fix `black` hook to work on all Python versions
- add `poetry install` hooks
- add `ruff`, `isort`, `pyright`, `pytest`, and `prisma generate` hooks for `backend/`
- add `ruff` and `pyright` hooks for `autogpt_libs/`
2024-11-08 16:06:23 +00:00
Aarushi d6ecf80197 Merge branch 'dev' 2024-11-07 22:43:26 -06:00
Aarushi c0f77c8e7a
ci(platform): Update pipelines to run from infra repo (#8595)
* deploy trigger

* update name

* test path change

* update prod deploy too

* remove envvars

* update prod deploy name
2024-11-07 22:39:59 -06:00
Aarushi 47759f6951
refactor(backend): Remove config.default.json (#8581)
remove config default json

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2024-11-07 21:20:15 -06:00
jackfromeast bcaf3241da
fix (backend): Patching the SSRF vulnerability in Github/Web Search/Request related blocks (#8531) 2024-11-08 00:29:18 +00:00
Aarushi c25d03e945
fix(searchtheweb): Fix the Jina Search Block (#8583)
* update jina search web block

* update to false
2024-11-07 17:26:17 -06:00
Zamil Majdy 91edf08540
feat(backend): Improve pyro reliability by adding connection timeout, retry, cleanup, and dynamic connection thread size (#8574) 2024-11-07 11:34:32 +07:00
Nicholas Tindle b08ad973fa
Update codeql.yml (#8500)
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-07 02:51:01 +00:00
Abhimanyu Yadav a037c431cd
feat(build-page): make all content unselectable except flowEditor (#8534)
* docs(platform): Update frontend instructions (#8514)

update readme

* docs(platform): correct readme

* feat(build-page): make all content unselectable except flowEditor

* revert some changes

* revert some changes

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2024-11-07 02:45:42 +00:00
Zamil Majdy 86c544177e
refactor(backend): Introduced Graph Input & Output Schema, Merge GraphMeta & Graph, Remove subgraph functionality (#8526) 2024-11-07 09:30:51 +07:00
Zamil Majdy af9ea5bc31
fix(frontend): Broken UI caused by lack of error propagation on Run (#8575) 2024-11-07 08:49:32 +07:00
Nicholas Tindle 25fa1bee1e
fix: update cookie (#8571) 2024-11-05 18:55:57 -06:00
Nicholas Tindle c76c077522
feat: bump elliptic to 6.6.0 (#8570) 2024-11-05 18:35:33 -06:00
dependabot[bot] 4259ad686e
build(deps): bump the production-dependencies group across 1 directory with 11 updates (#8567)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-11-05 17:32:15 -06:00
Aarushi 9a2664be35 fix(platform): Add local enc key (#8568)
add local enc key
2024-11-05 12:44:59 -08:00
Aarushi 9070378e60
fix(platform): Add local enc key (#8568)
add local enc key
2024-11-05 12:44:36 -08:00
dependabot[bot] f17c20ed91
build(deps): manual fix!! bump replicate from 0.34.1 to 1.0.3 in /autogpt_platform/backend (#8476)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2024-11-05 14:34:37 -06:00
Nicholas Tindle 799c6e550a
[Snyk] Security upgrade python from 3.11-slim-buster to 3.11.10-slim-bookworm (#8557)
Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: snyk-bot <snyk-bot@snyk.io>
2024-11-05 19:19:36 +00:00
Zamil Majdy 21100c109a
feat(backend): Reduce number of services on the local mode (#8563) 2024-11-05 22:36:51 +07:00
SwiftyOS b92c4774a6 feat(platform): Added .gitignore to platform root dir 2024-11-05 16:32:48 +01:00
Zamil Majdy 3a127dc355
feat(backend): Reduce number of services on the local mode (#8562) 2024-11-05 22:24:36 +07:00
SwiftyOS c19703150a fix(gitignore): Allow file extensions after .ign. or .ignore. 2024-11-05 16:09:19 +01:00
SwiftyOS 3e7d0e7f1b feat(backend): update .gitignore to include ignore / ign file extension 2024-11-05 16:07:13 +01:00
Aarushi 44f73078f7 Update docker-compose.platform.yml (#8560) 2024-11-04 21:38:05 -08:00
SwiftyOS db44d8c2ec fix(platform): Add ENCRYPTION_KEY Env Var to docker compose file 2024-11-04 10:06:43 -08:00
SwiftyOS 952f6f58ef docs(platform): correct readme 2024-11-01 09:13:55 +00:00
Swifty 151fad5ced docs(platform): Update frontend instructions (#8514)
update readme
2024-10-31 13:41:54 +00:00
632 changed files with 55673 additions and 20807 deletions

View File

@@ -1,40 +1,61 @@
# Ignore everything by default, selectively add things to context
classic/run
*
# AutoGPT
# Platform - Libs
!autogpt_platform/autogpt_libs/autogpt_libs/
!autogpt_platform/autogpt_libs/pyproject.toml
!autogpt_platform/autogpt_libs/poetry.lock
!autogpt_platform/autogpt_libs/README.md
# Platform - Backend
!autogpt_platform/backend/backend/
!autogpt_platform/backend/migrations/
!autogpt_platform/backend/schema.prisma
!autogpt_platform/backend/pyproject.toml
!autogpt_platform/backend/poetry.lock
!autogpt_platform/backend/README.md
# Platform - Market
!autogpt_platform/market/market/
!autogpt_platform/market/scripts.py
!autogpt_platform/market/schema.prisma
!autogpt_platform/market/pyproject.toml
!autogpt_platform/market/poetry.lock
!autogpt_platform/market/README.md
# Platform - Frontend
!autogpt_platform/frontend/src/
!autogpt_platform/frontend/public/
!autogpt_platform/frontend/package.json
!autogpt_platform/frontend/yarn.lock
!autogpt_platform/frontend/tsconfig.json
!autogpt_platform/frontend/README.md
## config
!autogpt_platform/frontend/*.config.*
!autogpt_platform/frontend/.env.*
# Classic - AutoGPT
!classic/original_autogpt/autogpt/
!classic/original_autogpt/pyproject.toml
!classic/original_autogpt/poetry.lock
!classic/original_autogpt/README.md
!classic/original_autogpt/tests/
# Benchmark
# Classic - Benchmark
!classic/benchmark/agbenchmark/
!classic/benchmark/pyproject.toml
!classic/benchmark/poetry.lock
!classic/benchmark/README.md
# Forge
# Classic - Forge
!classic/forge/
!classic/forge/pyproject.toml
!classic/forge/poetry.lock
!classic/forge/README.md
# Frontend
# Classic - Frontend
!classic/frontend/build/web/
# Platform
!autogpt_platform/
# Explicitly re-ignore some folders
.*
**/__pycache__
autogpt_platform/frontend/.next/
autogpt_platform/frontend/node_modules
autogpt_platform/frontend/.env.example
autogpt_platform/frontend/.env.local
autogpt_platform/backend/.env
autogpt_platform/backend/.venv/
autogpt_platform/market/.env

View File

@@ -1,36 +1,38 @@
### Background
<!-- Clearly explain the need for these changes: -->
### Changes 🏗️
<!-- Concisely describe all of the changes made in this pull request: -->
### Checklist 📋
### Testing 🔍
> [!NOTE]
Only for the new autogpt platform, currently in autogpt_platform/
#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
<!-- Put your test plan here: -->
- [ ] ...
<!--
Please make sure your changes have been tested and are in good working condition.
Here is a list of our critical paths, if you need some inspiration on what and how to test:
-->
<details>
<summary>Example test plan</summary>
- [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes correctly
- [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
- [ ] Edit an agent from monitor, and confirm it executes correctly
</details>
- Create from scratch and execute an agent with at least 3 blocks
- Import an agent from file upload, and confirm it executes correctly
- Upload agent to marketplace
- Import an agent from marketplace and confirm it executes correctly
- Edit an agent from monitor, and confirm it executes correctly
#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my changes
- [ ] I have included a list of my configuration changes in the PR description (under **Changes**)
### Configuration Changes 📝
> [!NOTE]
Only for the new autogpt platform, currently in autogpt_platform/
<details>
<summary>Examples of configuration changes</summary>
If you're making configuration or infrastructure changes, please remember to check you've updated the related infrastructure code in the autogpt_platform/infra folder.
Examples of such changes might include:
- Changing ports
- Adding new services that need to communicate with each other
- Secrets or environment variable changes
- New or infrastructure changes such as databases
- Changing ports
- Adding new services that need to communicate with each other
- Secrets or environment variable changes
- New or infrastructure changes such as databases
</details>

View File

@@ -7,6 +7,9 @@ updates:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: "dev"
commit-message:
prefix: "chore(libs/deps)"
prefix-development: "chore(libs/deps-dev)"
groups:
production-dependencies:
dependency-type: "production"
@@ -26,6 +29,9 @@ updates:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: "dev"
commit-message:
prefix: "chore(backend/deps)"
prefix-development: "chore(backend/deps-dev)"
groups:
production-dependencies:
dependency-type: "production"
@@ -38,7 +44,6 @@ updates:
- "minor"
- "patch"
# frontend (Next.js project)
- package-ecosystem: "npm"
directory: "autogpt_platform/frontend"
@@ -46,6 +51,9 @@ updates:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: "dev"
commit-message:
prefix: "chore(frontend/deps)"
prefix-development: "chore(frontend/deps-dev)"
groups:
production-dependencies:
dependency-type: "production"
@@ -58,7 +66,6 @@ updates:
- "minor"
- "patch"
# infra (Terraform)
- package-ecosystem: "terraform"
directory: "autogpt_platform/infra"
@@ -66,26 +73,10 @@ updates:
interval: "weekly"
open-pull-requests-limit: 5
target-branch: "dev"
groups:
production-dependencies:
dependency-type: "production"
update-types:
- "minor"
- "patch"
development-dependencies:
dependency-type: "development"
update-types:
- "minor"
- "patch"
commit-message:
prefix: "chore(infra/deps)"
prefix-development: "chore(infra/deps-dev)"
# market (Poetry project)
- package-ecosystem: "pip"
directory: "autogpt_platform/market"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: "dev"
groups:
production-dependencies:
dependency-type: "production"
@@ -146,6 +137,9 @@ updates:
interval: "weekly"
open-pull-requests-limit: 1
target-branch: "dev"
commit-message:
prefix: "chore(platform/deps)"
prefix-development: "chore(platform/deps-dev)"
groups:
production-dependencies:
dependency-type: "production"
@@ -166,6 +160,8 @@ updates:
interval: "weekly"
open-pull-requests-limit: 1
target-branch: "dev"
commit-message:
prefix: "chore(docs/deps)"
groups:
production-dependencies:
dependency-type: "production"

View File

@@ -5,7 +5,7 @@ on:
- cron: 20 4 * * 1,4
env:
BASE_BRANCH: development
BASE_BRANCH: dev
IMAGE_NAME: auto-gpt
jobs:
@@ -15,46 +15,46 @@ jobs:
matrix:
build-type: [release, dev]
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- id: build
name: Build image
uses: docker/build-push-action@v5
with:
context: classic/
file: classic/Dockerfile.autogpt
build-args: BUILD_TYPE=${{ matrix.build-type }}
load: true # save to docker images
# use GHA cache as read-only
cache-to: type=gha,scope=autogpt-docker-${{ matrix.build-type }},mode=max
- id: build
name: Build image
uses: docker/build-push-action@v6
with:
context: classic/
file: classic/Dockerfile.autogpt
build-args: BUILD_TYPE=${{ matrix.build-type }}
load: true # save to docker images
# use GHA cache as read-only
cache-to: type=gha,scope=autogpt-docker-${{ matrix.build-type }},mode=max
- name: Generate build report
env:
event_name: ${{ github.event_name }}
event_ref: ${{ github.event.schedule }}
- name: Generate build report
env:
event_name: ${{ github.event_name }}
event_ref: ${{ github.event.schedule }}
build_type: ${{ matrix.build-type }}
build_type: ${{ matrix.build-type }}
prod_branch: master
dev_branch: development
repository: ${{ github.repository }}
base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'development' && 'development' || 'master' }}
prod_branch: master
dev_branch: dev
repository: ${{ github.repository }}
base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'dev' && 'dev' || 'master' }}
current_ref: ${{ github.ref_name }}
commit_hash: ${{ github.sha }}
source_url: ${{ format('{0}/tree/{1}', github.event.repository.url, github.sha) }}
push_forced_label:
current_ref: ${{ github.ref_name }}
commit_hash: ${{ github.sha }}
source_url: ${{ format('{0}/tree/{1}', github.event.repository.url, github.sha) }}
push_forced_label:
new_commits_json: ${{ null }}
compare_url_template: ${{ format('/{0}/compare/{{base}}...{{head}}', github.repository) }}
new_commits_json: ${{ null }}
compare_url_template: ${{ format('/{0}/compare/{{base}}...{{head}}', github.repository) }}
github_context_json: ${{ toJSON(github) }}
job_env_json: ${{ toJSON(env) }}
vars_json: ${{ toJSON(vars) }}
github_context_json: ${{ toJSON(github) }}
job_env_json: ${{ toJSON(env) }}
vars_json: ${{ toJSON(vars) }}
run: .github/workflows/scripts/docker-ci-summary.sh >> $GITHUB_STEP_SUMMARY
continue-on-error: true
run: .github/workflows/scripts/docker-ci-summary.sh >> $GITHUB_STEP_SUMMARY
continue-on-error: true

View File

@@ -2,7 +2,7 @@ name: Classic - AutoGPT Docker CI
on:
push:
branches: [ master, development ]
branches: [master, dev]
paths:
- '.github/workflows/classic-autogpt-docker-ci.yml'
- 'classic/original_autogpt/**'
@@ -34,58 +34,58 @@ jobs:
matrix:
build-type: [release, dev]
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- if: runner.debug
run: |
ls -al
du -hs *
- if: runner.debug
run: |
ls -al
du -hs *
- id: build
name: Build image
uses: docker/build-push-action@v5
with:
context: classic/
file: classic/Dockerfile.autogpt
build-args: BUILD_TYPE=${{ matrix.build-type }}
tags: ${{ env.IMAGE_NAME }}
labels: GIT_REVISION=${{ github.sha }}
load: true # save to docker images
# cache layers in GitHub Actions cache to speed up builds
cache-from: type=gha,scope=autogpt-docker-${{ matrix.build-type }}
cache-to: type=gha,scope=autogpt-docker-${{ matrix.build-type }},mode=max
- id: build
name: Build image
uses: docker/build-push-action@v6
with:
context: classic/
file: classic/Dockerfile.autogpt
build-args: BUILD_TYPE=${{ matrix.build-type }}
tags: ${{ env.IMAGE_NAME }}
labels: GIT_REVISION=${{ github.sha }}
load: true # save to docker images
# cache layers in GitHub Actions cache to speed up builds
cache-from: type=gha,scope=autogpt-docker-${{ matrix.build-type }}
cache-to: type=gha,scope=autogpt-docker-${{ matrix.build-type }},mode=max
- name: Generate build report
env:
event_name: ${{ github.event_name }}
event_ref: ${{ github.event.ref }}
event_ref_type: ${{ github.event.ref}}
build_type: ${{ matrix.build-type }}
prod_branch: master
dev_branch: development
repository: ${{ github.repository }}
base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'development' && 'development' || 'master' }}
prod_branch: master
dev_branch: dev
repository: ${{ github.repository }}
base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'dev' && 'dev' || 'master' }}
current_ref: ${{ github.ref_name }}
commit_hash: ${{ github.event.after }}
source_url: ${{ format('{0}/tree/{1}', github.event.repository.url, github.event.release && github.event.release.tag_name || github.sha) }}
push_forced_label: ${{ github.event.forced && '☢️ forced' || '' }}
new_commits_json: ${{ toJSON(github.event.commits) }}
compare_url_template: ${{ format('/{0}/compare/{{base}}...{{head}}', github.repository) }}
github_context_json: ${{ toJSON(github) }}
job_env_json: ${{ toJSON(env) }}
vars_json: ${{ toJSON(vars) }}
run: .github/workflows/scripts/docker-ci-summary.sh >> $GITHUB_STEP_SUMMARY
continue-on-error: true
test:
runs-on: ubuntu-latest
@ -117,16 +117,16 @@ jobs:
- id: build
name: Build image
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
context: classic/
file: classic/Dockerfile.autogpt
build-args: BUILD_TYPE=dev # include pytest
tags: >
${{ env.IMAGE_NAME }},
${{ env.DEPLOY_IMAGE_NAME }}:${{ env.DEV_IMAGE_TAG }}
labels: GIT_REVISION=${{ github.sha }}
load: true # save to docker images
# cache layers in GitHub Actions cache to speed up builds
cache-from: type=gha,scope=autogpt-docker-dev
cache-to: type=gha,scope=autogpt-docker-dev,mode=max


@ -2,7 +2,7 @@ name: Classic - AutoGPT Docker Release
on:
release:
types: [ published, edited ]
types: [published, edited]
workflow_dispatch:
inputs:
@ -19,69 +19,69 @@ jobs:
if: startsWith(github.ref, 'refs/tags/autogpt-')
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Log in to Docker hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
# slashes are not allowed in image tags, but can appear in git branch or tag names
- id: sanitize_tag
name: Sanitize image tag
run: |
tag=${raw_tag//\//-}
echo tag=${tag#autogpt-} >> $GITHUB_OUTPUT
env:
raw_tag: ${{ github.ref_name }}
- id: build
name: Build image
uses: docker/build-push-action@v5
with:
context: classic/
file: Dockerfile.autogpt
build-args: BUILD_TYPE=release
load: true # save to docker images
# push: true # TODO: uncomment when this issue is fixed: https://github.com/moby/buildkit/issues/1555
tags: >
${{ env.IMAGE_NAME }},
${{ env.DEPLOY_IMAGE_NAME }}:latest,
${{ env.DEPLOY_IMAGE_NAME }}:${{ steps.sanitize_tag.outputs.tag }}
labels: GIT_REVISION=${{ github.sha }}
- id: build
name: Build image
uses: docker/build-push-action@v6
with:
context: classic/
file: Dockerfile.autogpt
build-args: BUILD_TYPE=release
load: true # save to docker images
# push: true # TODO: uncomment when this issue is fixed: https://github.com/moby/buildkit/issues/1555
tags: >
${{ env.IMAGE_NAME }},
${{ env.DEPLOY_IMAGE_NAME }}:latest,
${{ env.DEPLOY_IMAGE_NAME }}:${{ steps.sanitize_tag.outputs.tag }}
labels: GIT_REVISION=${{ github.sha }}
# cache layers in GitHub Actions cache to speed up builds
cache-from: ${{ !inputs.no_cache && 'type=gha' || '' }},scope=autogpt-docker-release
cache-to: type=gha,scope=autogpt-docker-release,mode=max
- name: Push image to Docker Hub
run: docker push --all-tags ${{ env.DEPLOY_IMAGE_NAME }}
- name: Generate build report
env:
event_name: ${{ github.event_name }}
event_ref: ${{ github.event.ref }}
event_ref_type: ${{ github.event.ref}}
inputs_no_cache: ${{ inputs.no_cache }}
prod_branch: master
dev_branch: development
repository: ${{ github.repository }}
base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'development' && 'development' || 'master' }}
prod_branch: master
dev_branch: dev
repository: ${{ github.repository }}
base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'dev' && 'dev' || 'master' }}
ref_type: ${{ github.ref_type }}
current_ref: ${{ github.ref_name }}
commit_hash: ${{ github.sha }}
source_url: ${{ format('{0}/tree/{1}', github.event.repository.url, github.event.release && github.event.release.tag_name || github.sha) }}
github_context_json: ${{ toJSON(github) }}
job_env_json: ${{ toJSON(env) }}
vars_json: ${{ toJSON(vars) }}
run: .github/workflows/scripts/docker-release-summary.sh >> $GITHUB_STEP_SUMMARY
continue-on-error: true


@ -102,7 +102,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
agent-name: [ forge ]
agent-name: [forge]
fail-fast: false
timeout-minutes: 20
steps:
@ -146,23 +146,23 @@ jobs:
echo "Running the following command: poetry run agbenchmark --mock --category=coding"
poetry run agbenchmark --mock --category=coding
echo "Running the following command: poetry run agbenchmark --test=WriteFile"
poetry run agbenchmark --test=WriteFile
# echo "Running the following command: poetry run agbenchmark --test=WriteFile"
# poetry run agbenchmark --test=WriteFile
cd ../benchmark
poetry install
echo "Adding the BUILD_SKILL_TREE environment variable. This will attempt to add new elements in the skill tree. If new elements are added, the CI fails because they should have been pushed"
export BUILD_SKILL_TREE=true
poetry run agbenchmark --mock
# poetry run agbenchmark --mock
CHANGED=$(git diff --name-only | grep -E '(agbenchmark/challenges)|(../classic/frontend/assets)') || echo "No diffs"
if [ ! -z "$CHANGED" ]; then
echo "There are unstaged changes please run agbenchmark and commit those changes since they are needed."
echo "$CHANGED"
exit 1
else
echo "No unstaged changes."
fi
# CHANGED=$(git diff --name-only | grep -E '(agbenchmark/challenges)|(../classic/frontend/assets)') || echo "No diffs"
# if [ ! -z "$CHANGED" ]; then
# echo "There are unstaged changes please run agbenchmark and commit those changes since they are needed."
# echo "$CHANGED"
# exit 1
# else
# echo "No unstaged changes."
# fi
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
TELEMETRY_ENVIRONMENT: autogpt-benchmark-ci


@ -4,7 +4,7 @@ on:
push:
branches:
- master
- development
- dev
- 'ci-test*' # This will match any branch that starts with "ci-test"
paths:
- 'classic/frontend/**'
@ -24,37 +24,37 @@ jobs:
BUILD_BRANCH: ${{ format('classic-frontend-build/{0}', github.ref_name) }}
steps:
- name: Checkout Repo
uses: actions/checkout@v4
- name: Setup Flutter
uses: subosito/flutter-action@v2
with:
flutter-version: '3.13.2'
- name: Build Flutter to Web
run: |
cd classic/frontend
flutter build web --base-href /app/
# - name: Commit and Push to ${{ env.BUILD_BRANCH }}
# if: github.event_name == 'push'
# run: |
# git config --local user.email "action@github.com"
# git config --local user.name "GitHub Action"
# git add classic/frontend/build/web
# git checkout -B ${{ env.BUILD_BRANCH }}
# git commit -m "Update frontend build to ${GITHUB_SHA:0:7}" -a
# git push -f origin ${{ env.BUILD_BRANCH }}
- name: Create PR ${{ env.BUILD_BRANCH }} -> ${{ github.ref_name }}
if: github.event_name == 'push'
uses: peter-evans/create-pull-request@v7
with:
add-paths: classic/frontend/build/web
base: ${{ github.ref_name }}
branch: ${{ env.BUILD_BRANCH }}
delete-branch: true
title: "Update frontend build in `${{ github.ref_name }}`"
body: "This PR updates the frontend build based on commit ${{ github.sha }}."
commit-message: "Update frontend build based on commit ${{ github.sha }}"


@ -13,9 +13,10 @@ name: "CodeQL"
on:
push:
branches: [ "master", "release-*" ]
branches: [ "master", "release-*", "dev" ]
pull_request:
branches: [ "master", "release-*" ]
branches: [ "master", "release-*", "dev" ]
merge_group:
schedule:
- cron: '15 4 * * 0'


@ -1,4 +1,4 @@
name: AutoGPT Platform - Build, Push, and Deploy Prod Environment
name: AutoGPT Platform - Deploy Prod Environment
on:
release:
@ -8,12 +8,6 @@ permissions:
contents: 'read'
id-token: 'write'
env:
PROJECT_ID: ${{ secrets.GCP_PROJECT_ID }}
GKE_CLUSTER: prod-gke-cluster
GKE_ZONE: us-central1-a
NAMESPACE: prod-agpt
jobs:
migrate:
environment: production
@ -41,142 +35,15 @@ jobs:
env:
DATABASE_URL: ${{ secrets.BACKEND_DATABASE_URL }}
- name: Run Market Migrations
working-directory: ./autogpt_platform/market
run: |
python -m prisma migrate deploy
env:
DATABASE_URL: ${{ secrets.MARKET_DATABASE_URL }}
build-push-deploy:
environment: production
name: Build, Push, and Deploy
trigger:
needs: migrate
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- id: 'auth'
uses: 'google-github-actions/auth@v2'
with:
workload_identity_provider: 'projects/1021527134101/locations/global/workloadIdentityPools/prod-pool/providers/github'
service_account: 'prod-github-actions-sa@agpt-prod.iam.gserviceaccount.com'
token_format: 'access_token'
create_credentials_file: true
- name: 'Set up Cloud SDK'
uses: 'google-github-actions/setup-gcloud@v2'
- name: 'Configure Docker'
run: |
gcloud auth configure-docker us-east1-docker.pkg.dev
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Cache Docker layers
uses: actions/cache@v4
with:
path: /tmp/.buildx-cache
key: ${{ runner.os }}-buildx-${{ github.sha }}
restore-keys: |
${{ runner.os }}-buildx-
- name: Check for changes
id: check_changes
run: |
git fetch origin master
BACKEND_CHANGED=$(git diff --name-only origin/master HEAD | grep "^autogpt_platform/backend/" && echo "true" || echo "false")
FRONTEND_CHANGED=$(git diff --name-only origin/master HEAD | grep "^autogpt_platform/frontend/" && echo "true" || echo "false")
MARKET_CHANGED=$(git diff --name-only origin/master HEAD | grep "^autogpt_platform/market/" && echo "true" || echo "false")
echo "backend_changed=$BACKEND_CHANGED" >> $GITHUB_OUTPUT
echo "frontend_changed=$FRONTEND_CHANGED" >> $GITHUB_OUTPUT
echo "market_changed=$MARKET_CHANGED" >> $GITHUB_OUTPUT
- name: Get GKE credentials
uses: 'google-github-actions/get-gke-credentials@v2'
with:
cluster_name: ${{ env.GKE_CLUSTER }}
location: ${{ env.GKE_ZONE }}
- name: Build and Push Backend
if: steps.check_changes.outputs.backend_changed == 'true'
uses: docker/build-push-action@v2
with:
context: .
file: ./autogpt_platform/backend/Dockerfile
push: true
tags: us-east1-docker.pkg.dev/agpt-prod/agpt-backend-prod/agpt-backend-prod:${{ github.sha }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max
- name: Build and Push Frontend
if: steps.check_changes.outputs.frontend_changed == 'true'
uses: docker/build-push-action@v2
with:
context: .
file: ./autogpt_platform/frontend/Dockerfile
push: true
tags: us-east1-docker.pkg.dev/agpt-prod/agpt-frontend-prod/agpt-frontend-prod:${{ github.sha }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max
- name: Build and Push Market
if: steps.check_changes.outputs.market_changed == 'true'
uses: docker/build-push-action@v2
with:
context: .
file: ./autogpt_platform/market/Dockerfile
push: true
tags: us-east1-docker.pkg.dev/agpt-prod/agpt-market-prod/agpt-market-prod:${{ github.sha }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max
- name: Move cache
run: |
rm -rf /tmp/.buildx-cache
mv /tmp/.buildx-cache-new /tmp/.buildx-cache
- name: Set up Helm
uses: azure/setup-helm@v4
with:
version: v3.4.0
- name: Deploy Backend
if: steps.check_changes.outputs.backend_changed == 'true'
run: |
helm upgrade autogpt-server ./autogpt-server \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-server/values.yaml \
-f autogpt-server/values.prod.yaml \
--set image.tag=${{ github.sha }}
- name: Deploy Websocket
if: steps.check_changes.outputs.backend_changed == 'true'
run: |
helm upgrade autogpt-websocket-server ./autogpt-websocket-server \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-websocket-server/values.yaml \
-f autogpt-websocket-server/values.prod.yaml \
--set image.tag=${{ github.sha }}
- name: Deploy Market
if: steps.check_changes.outputs.market_changed == 'true'
run: |
helm upgrade autogpt-market ./autogpt-market \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-market/values.yaml \
-f autogpt-market/values.prod.yaml \
--set image.tag=${{ github.sha }}
- name: Deploy Frontend
if: steps.check_changes.outputs.frontend_changed == 'true'
run: |
helm upgrade autogpt-builder ./autogpt-builder \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-builder/values.yaml \
-f autogpt-builder/values.prod.yaml \
--set image.tag=${{ github.sha }}
- name: Trigger deploy workflow
uses: peter-evans/repository-dispatch@v3
with:
token: ${{ secrets.DEPLOY_TOKEN }}
repository: Significant-Gravitas/AutoGPT_cloud_infrastructure
event-type: build_deploy_prod
client-payload: '{"ref": "${{ github.ref }}", "sha": "${{ github.sha }}", "repository": "${{ github.repository }}"}'


@ -0,0 +1,50 @@
name: AutoGPT Platform - Deploy Dev Environment
on:
push:
branches: [ dev ]
paths:
- 'autogpt_platform/**'
permissions:
contents: 'read'
id-token: 'write'
jobs:
migrate:
environment: develop
name: Run migrations for AutoGPT Platform
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.11'
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
pip install prisma
- name: Run Backend Migrations
working-directory: ./autogpt_platform/backend
run: |
python -m prisma migrate deploy
env:
DATABASE_URL: ${{ secrets.BACKEND_DATABASE_URL }}
trigger:
needs: migrate
runs-on: ubuntu-latest
steps:
- name: Trigger deploy workflow
uses: peter-evans/repository-dispatch@v3
with:
token: ${{ secrets.DEPLOY_TOKEN }}
repository: Significant-Gravitas/AutoGPT_cloud_infrastructure
event-type: build_deploy_dev
client-payload: '{"ref": "${{ github.ref }}", "sha": "${{ github.sha }}", "repository": "${{ github.repository }}"}'


@ -1,186 +0,0 @@
name: AutoGPT Platform - Build, Push, and Deploy Dev Environment
on:
push:
branches: [ dev ]
paths:
- 'autogpt_platform/backend/**'
- 'autogpt_platform/frontend/**'
- 'autogpt_platform/market/**'
permissions:
contents: 'read'
id-token: 'write'
env:
PROJECT_ID: ${{ secrets.GCP_PROJECT_ID }}
GKE_CLUSTER: dev-gke-cluster
GKE_ZONE: us-central1-a
NAMESPACE: dev-agpt
jobs:
migrate:
environment: develop
name: Run migrations for AutoGPT Platform
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.11'
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
pip install prisma
- name: Run Backend Migrations
working-directory: ./autogpt_platform/backend
run: |
python -m prisma migrate deploy
env:
DATABASE_URL: ${{ secrets.BACKEND_DATABASE_URL }}
- name: Run Market Migrations
working-directory: ./autogpt_platform/market
run: |
python -m prisma migrate deploy
env:
DATABASE_URL: ${{ secrets.MARKET_DATABASE_URL }}
build-push-deploy:
name: Build, Push, and Deploy
needs: migrate
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- id: 'auth'
uses: 'google-github-actions/auth@v2'
with:
workload_identity_provider: 'projects/638488734936/locations/global/workloadIdentityPools/dev-pool/providers/github'
service_account: 'dev-github-actions-sa@agpt-dev.iam.gserviceaccount.com'
token_format: 'access_token'
create_credentials_file: true
- name: 'Set up Cloud SDK'
uses: 'google-github-actions/setup-gcloud@v2'
- name: 'Configure Docker'
run: |
gcloud auth configure-docker us-east1-docker.pkg.dev
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Cache Docker layers
uses: actions/cache@v4
with:
path: /tmp/.buildx-cache
key: ${{ runner.os }}-buildx-${{ github.sha }}
restore-keys: |
${{ runner.os }}-buildx-
- name: Check for changes
id: check_changes
run: |
git fetch origin dev
BACKEND_CHANGED=$(git diff --name-only origin/dev HEAD | grep "^autogpt_platform/backend/" && echo "true" || echo "false")
FRONTEND_CHANGED=$(git diff --name-only origin/dev HEAD | grep "^autogpt_platform/frontend/" && echo "true" || echo "false")
MARKET_CHANGED=$(git diff --name-only origin/dev HEAD | grep "^autogpt_platform/market/" && echo "true" || echo "false")
echo "backend_changed=$BACKEND_CHANGED" >> $GITHUB_OUTPUT
echo "frontend_changed=$FRONTEND_CHANGED" >> $GITHUB_OUTPUT
echo "market_changed=$MARKET_CHANGED" >> $GITHUB_OUTPUT
- name: Get GKE credentials
uses: 'google-github-actions/get-gke-credentials@v2'
with:
cluster_name: ${{ env.GKE_CLUSTER }}
location: ${{ env.GKE_ZONE }}
- name: Build and Push Backend
if: steps.check_changes.outputs.backend_changed == 'true'
uses: docker/build-push-action@v2
with:
context: .
file: ./autogpt_platform/backend/Dockerfile
push: true
tags: us-east1-docker.pkg.dev/agpt-dev/agpt-backend-dev/agpt-backend-dev:${{ github.sha }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max
- name: Build and Push Frontend
if: steps.check_changes.outputs.frontend_changed == 'true'
uses: docker/build-push-action@v2
with:
context: .
file: ./autogpt_platform/frontend/Dockerfile
push: true
tags: us-east1-docker.pkg.dev/agpt-dev/agpt-frontend-dev/agpt-frontend-dev:${{ github.sha }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max
- name: Build and Push Market
if: steps.check_changes.outputs.market_changed == 'true'
uses: docker/build-push-action@v2
with:
context: .
file: ./autogpt_platform/market/Dockerfile
push: true
tags: us-east1-docker.pkg.dev/agpt-dev/agpt-market-dev/agpt-market-dev:${{ github.sha }}
cache-from: type=local,src=/tmp/.buildx-cache
cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max
- name: Move cache
run: |
rm -rf /tmp/.buildx-cache
mv /tmp/.buildx-cache-new /tmp/.buildx-cache
- name: Set up Helm
uses: azure/setup-helm@v4
with:
version: v3.4.0
- name: Deploy Backend
if: steps.check_changes.outputs.backend_changed == 'true'
run: |
helm upgrade autogpt-server ./autogpt-server \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-server/values.yaml \
-f autogpt-server/values.dev.yaml \
--set image.tag=${{ github.sha }}
- name: Deploy Websocket
if: steps.check_changes.outputs.backend_changed == 'true'
run: |
helm upgrade autogpt-websocket-server ./autogpt-websocket-server \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-websocket-server/values.yaml \
-f autogpt-websocket-server/values.dev.yaml \
--set image.tag=${{ github.sha }}
- name: Deploy Market
if: steps.check_changes.outputs.market_changed == 'true'
run: |
helm upgrade autogpt-market ./autogpt-market \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-market/values.yaml \
-f autogpt-market/values.dev.yaml \
--set image.tag=${{ github.sha }}
- name: Deploy Frontend
if: steps.check_changes.outputs.frontend_changed == 'true'
run: |
helm upgrade autogpt-builder ./autogpt-builder \
--namespace ${{ env.NAMESPACE }} \
-f autogpt-builder/values.yaml \
-f autogpt-builder/values.dev.yaml \
--set image.tag=${{ github.sha }}


@ -1,56 +0,0 @@
name: AutoGPT Platform - Infra
on:
push:
branches: [ master, dev ]
paths:
- '.github/workflows/platform-autogpt-infra-ci.yml'
- 'autogpt_platform/infra/**'
pull_request:
paths:
- '.github/workflows/platform-autogpt-infra-ci.yml'
- 'autogpt_platform/infra/**'
defaults:
run:
shell: bash
working-directory: autogpt_platform/infra
jobs:
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: TFLint
uses: pauloconnor/tflint-action@v0.0.2
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tflint_path: terraform/
tflint_recurse: true
tflint_changed_only: false
- name: Set up Helm
uses: azure/setup-helm@v4
with:
version: v3.14.4
- name: Set up chart-testing
uses: helm/chart-testing-action@v2.6.1
- name: Run chart-testing (list-changed)
id: list-changed
run: |
changed=$(ct list-changed --target-branch ${{ github.event.repository.default_branch }})
if [[ -n "$changed" ]]; then
echo "changed=true" >> "$GITHUB_OUTPUT"
fi
- name: Run chart-testing (lint)
if: steps.list-changed.outputs.changed == 'true'
run: ct lint --target-branch ${{ github.event.repository.default_branch }}


@ -6,11 +6,14 @@ on:
paths:
- ".github/workflows/platform-backend-ci.yml"
- "autogpt_platform/backend/**"
- "autogpt_platform/autogpt_libs/**"
pull_request:
branches: [master, dev, release-*]
paths:
- ".github/workflows/platform-backend-ci.yml"
- "autogpt_platform/backend/**"
- "autogpt_platform/autogpt_libs/**"
merge_group:
concurrency:
group: ${{ format('backend-ci-{0}', github.head_ref && format('{0}-{1}', github.event_name, github.event.pull_request.number) || github.sha) }}
@ -76,6 +79,17 @@ jobs:
echo "$HOME/.local/bin" >> $GITHUB_PATH
fi
- name: Check poetry.lock
run: |
poetry lock
if ! git diff --quiet poetry.lock; then
echo "Error: poetry.lock not up to date."
echo
git diff poetry.lock
exit 1
fi
- name: Install Python dependencies
run: poetry install


@ -10,6 +10,7 @@ on:
paths:
- ".github/workflows/platform-frontend-ci.yml"
- "autogpt_platform/frontend/**"
merge_group:
defaults:
run:
@ -22,6 +23,7 @@ jobs:
steps:
- uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4
with:
@ -37,24 +39,12 @@ jobs:
test:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
browser: [chromium, webkit]
steps:
- name: Free Disk Space (Ubuntu)
uses: jlumbroso/free-disk-space@main
with:
# this might remove tools that are actually needed,
# if set to "true" but frees about 6 GB
tool-cache: false
# all of these default to true, but feel free to set to
# "false" if necessary for your workflow
android: false
dotnet: false
haskell: false
large-packages: true
docker-images: true
swap-storage: true
- name: Checkout repository
uses: actions/checkout@v4
with:
@ -65,10 +55,20 @@ jobs:
with:
node-version: "21"
- name: Free Disk Space (Ubuntu)
uses: jlumbroso/free-disk-space@main
with:
large-packages: false # slow
docker-images: false # limited benefit
- name: Copy default supabase .env
run: |
cp ../supabase/docker/.env.example ../.env
- name: Copy backend .env
run: |
cp ../backend/.env.example ../backend/.env
- name: Run docker compose
run: |
docker compose -f ../docker-compose.yml up -d
@ -81,16 +81,21 @@ jobs:
run: |
cp .env.example .env
- name: Install Playwright Browsers
run: yarn playwright install --with-deps
- name: Install Browser '${{ matrix.browser }}'
run: yarn playwright install --with-deps ${{ matrix.browser }}
- name: Run tests
run: |
yarn test
yarn test --project=${{ matrix.browser }}
- name: Print Docker Compose logs in debug mode
if: runner.debug
run: |
docker compose -f ../docker-compose.yml logs
- uses: actions/upload-artifact@v4
if: ${{ !cancelled() }}
with:
name: playwright-report
name: playwright-report-${{ matrix.browser }}
path: playwright-report/
retention-days: 30


@ -1,125 +0,0 @@
name: AutoGPT Platform - Backend CI
on:
push:
branches: [master, dev, ci-test*]
paths:
- ".github/workflows/platform-market-ci.yml"
- "autogpt_platform/market/**"
pull_request:
branches: [master, dev, release-*]
paths:
- ".github/workflows/platform-market-ci.yml"
- "autogpt_platform/market/**"
concurrency:
group: ${{ format('backend-ci-{0}', github.head_ref && format('{0}-{1}', github.event_name, github.event.pull_request.number) || github.sha) }}
cancel-in-progress: ${{ startsWith(github.event_name, 'pull_request') }}
defaults:
run:
shell: bash
working-directory: autogpt_platform/market
jobs:
test:
permissions:
contents: read
timeout-minutes: 30
strategy:
fail-fast: false
matrix:
python-version: ["3.10"]
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 0
submodules: true
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Setup Supabase
uses: supabase/setup-cli@v1
with:
version: latest
- id: get_date
name: Get date
run: echo "date=$(date +'%Y-%m-%d')" >> $GITHUB_OUTPUT
- name: Set up Python dependency cache
uses: actions/cache@v4
with:
path: ~/.cache/pypoetry
key: poetry-${{ runner.os }}-${{ hashFiles('autogpt_platform/market/poetry.lock') }}
- name: Install Poetry (Unix)
run: |
curl -sSL https://install.python-poetry.org | python3 -
if [ "${{ runner.os }}" = "macOS" ]; then
PATH="$HOME/.local/bin:$PATH"
echo "$HOME/.local/bin" >> $GITHUB_PATH
fi
- name: Install Python dependencies
run: poetry install
- name: Generate Prisma Client
run: poetry run prisma generate
- id: supabase
name: Start Supabase
working-directory: .
run: |
supabase init
supabase start --exclude postgres-meta,realtime,storage-api,imgproxy,inbucket,studio,edge-runtime,logflare,vector,supavisor
supabase status -o env | sed 's/="/=/; s/"$//' >> $GITHUB_OUTPUT
# outputs:
# DB_URL, API_URL, GRAPHQL_URL, ANON_KEY, SERVICE_ROLE_KEY, JWT_SECRET
- name: Run Database Migrations
run: poetry run prisma migrate dev --name updates
env:
DATABASE_URL: ${{ steps.supabase.outputs.DB_URL }}
- id: lint
name: Run Linter
run: poetry run lint
# Tests comment out because they do not work with prisma mock, nor have they been updated since they were created
# - name: Run pytest with coverage
# run: |
# if [[ "${{ runner.debug }}" == "1" ]]; then
# poetry run pytest -s -vv -o log_cli=true -o log_cli_level=DEBUG test
# else
# poetry run pytest -s -vv test
# fi
# if: success() || (failure() && steps.lint.outcome == 'failure')
# env:
# LOG_LEVEL: ${{ runner.debug && 'DEBUG' || 'INFO' }}
# DATABASE_URL: ${{ steps.supabase.outputs.DB_URL }}
# SUPABASE_URL: ${{ steps.supabase.outputs.API_URL }}
# SUPABASE_SERVICE_ROLE_KEY: ${{ steps.supabase.outputs.SERVICE_ROLE_KEY }}
# SUPABASE_JWT_SECRET: ${{ steps.supabase.outputs.JWT_SECRET }}
# REDIS_HOST: 'localhost'
# REDIS_PORT: '6379'
# REDIS_PASSWORD: 'testpassword'
env:
CI: true
PLAIN_OUTPUT: True
RUN_ENV: local
PORT: 8080
# - name: Upload coverage reports to Codecov
# uses: codecov/codecov-action@v4
# with:
# token: ${{ secrets.CODECOV_TOKEN }}
# flags: backend,${{ runner.os }}


@ -2,6 +2,7 @@ name: Repo - PR Status Checker
on:
pull_request:
types: [opened, synchronize, reopened]
merge_group:
jobs:
status-check:


@ -7,13 +7,18 @@ from typing import Dict, List, Tuple
CHECK_INTERVAL = 30
def get_environment_variables() -> Tuple[str, str, str, str, str]:
"""Retrieve and return necessary environment variables."""
try:
with open(os.environ["GITHUB_EVENT_PATH"]) as f:
event = json.load(f)
sha = event["pull_request"]["head"]["sha"]
# Handle both PR and merge group events
if "pull_request" in event:
sha = event["pull_request"]["head"]["sha"]
else:
sha = os.environ["GITHUB_SHA"]
return (
os.environ["GITHUB_API_URL"],

.gitignore

@ -171,3 +171,8 @@ ig*
.github_access_token
LICENSE.rtf
autogpt_platform/backend/settings.py
/.auth
/autogpt_platform/frontend/.auth
*.ign.*
.test-contents


@ -9,7 +9,7 @@ repos:
- id: check-merge-conflict
- id: check-symlinks
- id: debug-statements
- repo: https://github.com/Yelp/detect-secrets
rev: v1.5.0
hooks:
@ -19,27 +19,122 @@ repos:
files: ^autogpt_platform/
stages: [push]
- repo: local
# For proper type checking, all dependencies need to be up-to-date.
# It's also a good idea to check that poetry.lock is consistent with pyproject.toml.
hooks:
- id: poetry-install
name: Check & Install dependencies - AutoGPT Platform - Backend
alias: poetry-install-platform-backend
entry: poetry -C autogpt_platform/backend install
# include autogpt_libs source (since it's a path dependency)
files: ^autogpt_platform/(backend|autogpt_libs)/poetry\.lock$
types: [file]
language: system
pass_filenames: false
- id: poetry-install
name: Check & Install dependencies - AutoGPT Platform - Libs
alias: poetry-install-platform-libs
entry: poetry -C autogpt_platform/autogpt_libs install
files: ^autogpt_platform/autogpt_libs/poetry\.lock$
types: [file]
language: system
pass_filenames: false
- id: poetry-install
name: Check & Install dependencies - Classic - AutoGPT
alias: poetry-install-classic-autogpt
entry: poetry -C classic/original_autogpt install
# include forge source (since it's a path dependency)
files: ^classic/(original_autogpt|forge)/poetry\.lock$
types: [file]
language: system
pass_filenames: false
- id: poetry-install
name: Check & Install dependencies - Classic - Forge
alias: poetry-install-classic-forge
entry: poetry -C classic/forge install
files: ^classic/forge/poetry\.lock$
types: [file]
language: system
pass_filenames: false
- id: poetry-install
name: Check & Install dependencies - Classic - Benchmark
alias: poetry-install-classic-benchmark
entry: poetry -C classic/benchmark install
files: ^classic/benchmark/poetry\.lock$
types: [file]
language: system
pass_filenames: false
- repo: local
# For proper type checking, Prisma client must be up-to-date.
hooks:
- id: prisma-generate
name: Prisma Generate - AutoGPT Platform - Backend
alias: prisma-generate-platform-backend
entry: bash -c 'cd autogpt_platform/backend && poetry run prisma generate'
# include everything that triggers poetry install + the prisma schema
files: ^autogpt_platform/((backend|autogpt_libs)/poetry\.lock|backend/schema.prisma)$
types: [file]
language: system
pass_filenames: false
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.7.2
hooks:
- id: ruff
name: Lint (Ruff) - AutoGPT Platform - Backend
alias: ruff-lint-platform-backend
files: ^autogpt_platform/backend/
args: [--fix]
- id: ruff
name: Lint (Ruff) - AutoGPT Platform - Libs
alias: ruff-lint-platform-libs
files: ^autogpt_platform/autogpt_libs/
args: [--fix]
- id: ruff-format
name: Format (Ruff) - AutoGPT Platform - Libs
alias: ruff-lint-platform-libs
files: ^autogpt_platform/autogpt_libs/
- repo: local
# isort needs the context of which packages are installed to function, so we
# can't use a vendored isort pre-commit hook (which runs in its own isolated venv).
hooks:
- id: isort-autogpt
name: Lint (isort) - AutoGPT
entry: poetry -C classic/original_autogpt run isort
- id: isort
name: Lint (isort) - AutoGPT Platform - Backend
alias: isort-platform-backend
entry: poetry -P autogpt_platform/backend run isort -p backend
files: ^autogpt_platform/backend/
types: [file, python]
language: system
- id: isort
name: Lint (isort) - Classic - AutoGPT
alias: isort-classic-autogpt
entry: poetry -P classic/original_autogpt run isort -p autogpt
files: ^classic/original_autogpt/
types: [file, python]
language: system
- id: isort-forge
name: Lint (isort) - Forge
entry: poetry -C classic/forge run isort
- id: isort
name: Lint (isort) - Classic - Forge
alias: isort-classic-forge
entry: poetry -P classic/forge run isort -p forge
files: ^classic/forge/
types: [file, python]
language: system
- id: isort-benchmark
name: Lint (isort) - Benchmark
entry: poetry -C classic/benchmark run isort
- id: isort
name: Lint (isort) - Classic - Benchmark
alias: isort-classic-benchmark
entry: poetry -P classic/benchmark run isort -p agbenchmark
files: ^classic/benchmark/
types: [file, python]
language: system
@ -50,8 +145,7 @@ repos:
# everything in .gitignore, so it works fine without any config or arguments.
hooks:
- id: black
name: Lint (Black)
language_version: python3.12
name: Format (Black)
- repo: https://github.com/PyCQA/flake8
rev: 7.0.0
@ -59,20 +153,20 @@ repos:
# them separately.
hooks:
- id: flake8
name: Lint (Flake8) - AutoGPT
alias: flake8-autogpt
name: Lint (Flake8) - Classic - AutoGPT
alias: flake8-classic-autogpt
files: ^classic/original_autogpt/(autogpt|scripts|tests)/
args: [--config=classic/original_autogpt/.flake8]
- id: flake8
name: Lint (Flake8) - Forge
alias: flake8-forge
name: Lint (Flake8) - Classic - Forge
alias: flake8-classic-forge
files: ^classic/forge/(forge|tests)/
args: [--config=classic/forge/.flake8]
- id: flake8
name: Lint (Flake8) - Benchmark
alias: flake8-benchmark
name: Lint (Flake8) - Classic - Benchmark
alias: flake8-classic-benchmark
files: ^classic/benchmark/(agbenchmark|tests)/((?!reports).)*[/.]
args: [--config=classic/benchmark/.flake8]
@ -81,31 +175,47 @@ repos:
# project. To trigger on poetry.lock we also reset the file `types` filter.
hooks:
- id: pyright
name: Typecheck - AutoGPT
alias: pyright-autogpt
entry: poetry -C classic/original_autogpt run pyright
args: [-p, autogpt, autogpt]
name: Typecheck - AutoGPT Platform - Backend
alias: pyright-platform-backend
entry: poetry -C autogpt_platform/backend run pyright
# include forge source (since it's a path dependency) but exclude *_test.py files:
files: ^(classic/original_autogpt/((autogpt|scripts|tests)/|poetry\.lock$)|classic/forge/(classic/forge/.*(?<!_test)\.py|poetry\.lock)$)
files: ^autogpt_platform/(backend/((backend|test)/|(\w+\.py|poetry\.lock)$)|autogpt_libs/(autogpt_libs/.*(?<!_test)\.py|poetry\.lock)$)
types: [file]
language: system
pass_filenames: false
- id: pyright
name: Typecheck - Forge
alias: pyright-forge
name: Typecheck - AutoGPT Platform - Libs
alias: pyright-platform-libs
entry: poetry -C autogpt_platform/autogpt_libs run pyright
files: ^autogpt_platform/autogpt_libs/(autogpt_libs/|poetry\.lock$)
types: [file]
language: system
pass_filenames: false
- id: pyright
name: Typecheck - Classic - AutoGPT
alias: pyright-classic-autogpt
entry: poetry -C classic/original_autogpt run pyright
# include forge source (since it's a path dependency) but exclude *_test.py files:
files: ^(classic/original_autogpt/((autogpt|scripts|tests)/|poetry\.lock$)|classic/forge/(forge/.*(?<!_test)\.py|poetry\.lock)$)
types: [file]
language: system
pass_filenames: false
- id: pyright
name: Typecheck - Classic - Forge
alias: pyright-classic-forge
entry: poetry -C classic/forge run pyright
args: [-p, forge, forge]
files: ^classic/forge/(classic/forge/|poetry\.lock$)
files: ^classic/forge/(forge/|poetry\.lock$)
types: [file]
language: system
pass_filenames: false
- id: pyright
name: Typecheck - Benchmark
alias: pyright-benchmark
name: Typecheck - Classic - Benchmark
alias: pyright-classic-benchmark
entry: poetry -C classic/benchmark run pyright
args: [-p, benchmark, benchmark]
files: ^classic/benchmark/(agbenchmark/|tests/|poetry\.lock$)
types: [file]
language: system
@ -113,23 +223,35 @@ repos:
- repo: local
hooks:
- id: pytest-autogpt
name: Run tests - AutoGPT (excl. slow tests)
- id: pytest
name: Run tests - AutoGPT Platform - Backend
alias: pytest-platform-backend
entry: bash -c 'cd autogpt_platform/backend && poetry run pytest'
# include autogpt_libs source (since it's a path dependency) but exclude *_test.py files:
files: ^autogpt_platform/(backend/((backend|test)/|poetry\.lock$)|autogpt_libs/(autogpt_libs/.*(?<!_test)\.py|poetry\.lock)$)
language: system
pass_filenames: false
- id: pytest
name: Run tests - Classic - AutoGPT (excl. slow tests)
alias: pytest-classic-autogpt
entry: bash -c 'cd classic/original_autogpt && poetry run pytest --cov=autogpt -m "not slow" tests/unit tests/integration'
# include forge source (since it's a path dependency) but exclude *_test.py files:
files: ^(classic/original_autogpt/((autogpt|tests)/|poetry\.lock$)|classic/forge/(classic/forge/.*(?<!_test)\.py|poetry\.lock)$)
files: ^(classic/original_autogpt/((autogpt|tests)/|poetry\.lock$)|classic/forge/(forge/.*(?<!_test)\.py|poetry\.lock)$)
language: system
pass_filenames: false
- id: pytest-forge
name: Run tests - Forge (excl. slow tests)
- id: pytest
name: Run tests - Classic - Forge (excl. slow tests)
alias: pytest-classic-forge
entry: bash -c 'cd classic/forge && poetry run pytest --cov=forge -m "not slow"'
files: ^classic/forge/(classic/forge/|tests/|poetry\.lock$)
files: ^classic/forge/(forge/|tests/|poetry\.lock$)
language: system
pass_filenames: false
- id: pytest-benchmark
name: Run tests - Benchmark
- id: pytest
name: Run tests - Classic - Benchmark
alias: pytest-classic-benchmark
entry: bash -c 'cd classic/benchmark && poetry run pytest --cov=benchmark'
files: ^classic/benchmark/(agbenchmark/|tests/|poetry\.lock$)
language: system

.vscode/launch.json

@ -0,0 +1,67 @@
{
"version": "0.2.0",
"configurations": [
{
"name": "Frontend: Server Side",
"type": "node-terminal",
"request": "launch",
"cwd": "${workspaceFolder}/autogpt_platform/frontend",
"command": "yarn dev"
},
{
"name": "Frontend: Client Side",
"type": "msedge",
"request": "launch",
"url": "http://localhost:3000"
},
{
"name": "Frontend: Full Stack",
"type": "node-terminal",
"request": "launch",
"command": "yarn dev",
"cwd": "${workspaceFolder}/autogpt_platform/frontend",
"serverReadyAction": {
"pattern": "- Local:.+(https?://.+)",
"uriFormat": "%s",
"action": "debugWithEdge"
}
},
{
"name": "Backend",
"type": "debugpy",
"request": "launch",
"module": "backend.app",
// "env": {
// "ENV": "dev"
// },
"envFile": "${workspaceFolder}/backend/.env",
"justMyCode": false,
"cwd": "${workspaceFolder}/autogpt_platform/backend"
},
{
"name": "Marketplace",
"type": "debugpy",
"request": "launch",
"module": "autogpt_platform.market.main",
"env": {
"ENV": "dev"
},
"envFile": "${workspaceFolder}/market/.env",
"justMyCode": false,
"cwd": "${workspaceFolder}/market"
}
],
"compounds": [
{
"name": "Everything",
"configurations": ["Backend", "Frontend: Full Stack"],
// "preLaunchTask": "${defaultBuildTask}",
"stopAll": true,
"presentation": {
"hidden": false,
"order": 0
}
}
]
}


@ -35,7 +35,7 @@ The AutoGPT frontend is where users interact with our powerful AI automation pla
**Monitoring and Analytics:** Keep track of your agents' performance and gain insights to continually improve your automation processes.
[Read this guide](https://docs.agpt.co/server/new_blocks/) to learn how to build your own custom blocks.
[Read this guide](https://docs.agpt.co/platform/new_blocks/) to learn how to build your own custom blocks.
### 💽 AutoGPT Server

SECURITY.md

@ -0,0 +1,47 @@
# Security Policy
## Reporting Security Issues
We take the security of our project seriously. If you believe you have found a security vulnerability, please report it to us privately. **Please do not report security vulnerabilities through public GitHub issues, discussions, or pull requests.**
> **Important Note**: Any code within the `classic/` folder is considered legacy, unsupported, and out of scope for security reports. We will not address security vulnerabilities in this deprecated code.
Instead, please report them via:
- [GitHub Security Advisory](https://github.com/Significant-Gravitas/AutoGPT/security/advisories/new)
<!--- [Huntr.dev](https://huntr.com/repos/significant-gravitas/autogpt) - where you may be eligible for a bounty-->
### Reporting Process
1. **Submit Report**: Use one of the above channels to submit your report
2. **Response Time**: Our team will acknowledge receipt of your report within 14 business days.
3. **Collaboration**: We will collaborate with you to understand and validate the issue
4. **Resolution**: We will work on a fix and coordinate the release process
### Disclosure Policy
- Please provide detailed reports with reproducible steps
- Include the version/commit hash where you discovered the vulnerability
- Allow us a 90-day security fix window before any public disclosure
- Share any potential mitigations or workarounds if known
## Supported Versions
Only the following versions are eligible for security updates:
| Version | Supported |
|---------|-----------|
| Latest release on master branch | ✅ |
| Development commits (pre-master) | ✅ |
| Classic folder (deprecated) | ❌ |
| All other versions | ❌ |
## Security Best Practices
When using this project:
1. Always use the latest stable version
2. Review security advisories before updating
3. Follow our security documentation and guidelines
4. Keep your dependencies up to date
5. Do not use code from the `classic/` folder as it is deprecated and unsupported
## Past Security Advisories
For a list of past security advisories, please visit our [Security Advisory Page](https://github.com/Significant-Gravitas/AutoGPT/security/advisories) and [Huntr Disclosures Page](https://huntr.com/repos/significant-gravitas/autogpt).
---
Last updated: November 2024

autogpt_platform/.gitignore

@ -0,0 +1,2 @@
*.ignore.*
*.ign.*


@ -0,0 +1,34 @@
import hashlib
import secrets
from typing import NamedTuple
class APIKeyContainer(NamedTuple):
"""Container for API key parts."""
raw: str
prefix: str
postfix: str
hash: str
class APIKeyManager:
PREFIX: str = "agpt_"
PREFIX_LENGTH: int = 8
POSTFIX_LENGTH: int = 8
def generate_api_key(self) -> APIKeyContainer:
"""Generate a new API key with all its parts."""
raw_key = f"{self.PREFIX}{secrets.token_urlsafe(32)}"
return APIKeyContainer(
raw=raw_key,
prefix=raw_key[: self.PREFIX_LENGTH],
postfix=raw_key[-self.POSTFIX_LENGTH :],
hash=hashlib.sha256(raw_key.encode()).hexdigest(),
)
def verify_api_key(self, provided_key: str, stored_hash: str) -> bool:
"""Verify if a provided API key matches the stored hash."""
if not provided_key.startswith(self.PREFIX):
return False
return hashlib.sha256(provided_key.encode()).hexdigest() == stored_hash
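A minimal usage sketch of the lifecycle this helper implies, with a hypothetical storage record (not part of the module above): the raw key is shown to the caller once, only the prefix, postfix, and hash are persisted, and later requests are verified against the stored hash.

# Hypothetical usage of APIKeyManager; the `stored` dict stands in for a database row.
manager = APIKeyManager()
key = manager.generate_api_key()

# Show key.raw to the user exactly once; persist only the non-recoverable parts.
stored = {"prefix": key.prefix, "postfix": key.postfix, "hash": key.hash}

# On later requests, compare the presented key against the stored hash.
assert manager.verify_api_key(key.raw, stored["hash"])
assert not manager.verify_api_key("agpt_not-the-right-key", stored["hash"])
assert not manager.verify_api_key("wrong-prefix-key", stored["hash"])  # rejected by the prefix check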


@ -1,7 +1,8 @@
import fastapi
from .config import Settings
from .middleware import auth_middleware
from .models import User
from .models import DEFAULT_USER_ID, User
def requires_user(payload: dict = fastapi.Depends(auth_middleware)) -> User:
@ -16,8 +17,12 @@ def requires_admin_user(
def verify_user(payload: dict | None, admin_only: bool) -> User:
if not payload:
if Settings.ENABLE_AUTH:
raise fastapi.HTTPException(
status_code=401, detail="Authorization header is missing"
)
# This handles the case when authentication is disabled
payload = {"sub": "3e53486c-cf57-477e-ba2a-cb02dc828e1a", "role": "admin"}
payload = {"sub": DEFAULT_USER_ID, "role": "admin"}
user_id = payload.get("sub")
@ -30,3 +35,12 @@ def verify_user(payload: dict | None, admin_only: bool) -> User:
raise fastapi.HTTPException(status_code=403, detail="Admin access required")
return User.from_payload(payload)
def get_user_id(payload: dict = fastapi.Depends(auth_middleware)) -> str:
user_id = payload.get("sub")
if not user_id:
raise fastapi.HTTPException(
status_code=401, detail="User ID not found in token"
)
return user_id

View File

@ -1,5 +1,8 @@
from dataclasses import dataclass
DEFAULT_USER_ID = "3e53486c-cf57-477e-ba2a-cb02dc828e1a"
DEFAULT_EMAIL = "default@example.com"
# Using dataclass here to avoid adding dependency on pydantic
@dataclass(frozen=True)


@ -0,0 +1,167 @@
import asyncio
import contextlib
import logging
from functools import wraps
from typing import Any, Awaitable, Callable, Dict, Optional, TypeVar, Union, cast
import ldclient
from fastapi import HTTPException
from ldclient import Context, LDClient
from ldclient.config import Config
from typing_extensions import ParamSpec
from .config import SETTINGS
logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.DEBUG)
P = ParamSpec("P")
T = TypeVar("T")
def get_client() -> LDClient:
"""Get the LaunchDarkly client singleton."""
return ldclient.get()
def initialize_launchdarkly() -> None:
sdk_key = SETTINGS.launch_darkly_sdk_key
logger.debug(
f"Initializing LaunchDarkly with SDK key: {'present' if sdk_key else 'missing'}"
)
if not sdk_key:
logger.warning("LaunchDarkly SDK key not configured")
return
config = Config(sdk_key)
ldclient.set_config(config)
if ldclient.get().is_initialized():
logger.info("LaunchDarkly client initialized successfully")
else:
logger.error("LaunchDarkly client failed to initialize")
def shutdown_launchdarkly() -> None:
"""Shutdown the LaunchDarkly client."""
if ldclient.get().is_initialized():
ldclient.get().close()
logger.info("LaunchDarkly client closed successfully")
def create_context(
user_id: str, additional_attributes: Optional[Dict[str, Any]] = None
) -> Context:
"""Create LaunchDarkly context with optional additional attributes."""
builder = Context.builder(str(user_id)).kind("user")
if additional_attributes:
for key, value in additional_attributes.items():
builder.set(key, value)
return builder.build()
def feature_flag(
flag_key: str,
default: bool = False,
) -> Callable[
[Callable[P, Union[T, Awaitable[T]]]], Callable[P, Union[T, Awaitable[T]]]
]:
"""
Decorator for feature flag protected endpoints.
"""
def decorator(
func: Callable[P, Union[T, Awaitable[T]]],
) -> Callable[P, Union[T, Awaitable[T]]]:
@wraps(func)
async def async_wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
try:
user_id = kwargs.get("user_id")
if not user_id:
raise ValueError("user_id is required")
if not get_client().is_initialized():
logger.warning(
f"LaunchDarkly not initialized, using default={default}"
)
is_enabled = default
else:
context = create_context(str(user_id))
is_enabled = get_client().variation(flag_key, context, default)
if not is_enabled:
raise HTTPException(status_code=404, detail="Feature not available")
result = func(*args, **kwargs)
if asyncio.iscoroutine(result):
return await result
return cast(T, result)
except Exception as e:
logger.error(f"Error evaluating feature flag {flag_key}: {e}")
raise
@wraps(func)
def sync_wrapper(*args: P.args, **kwargs: P.kwargs) -> T:
try:
user_id = kwargs.get("user_id")
if not user_id:
raise ValueError("user_id is required")
if not get_client().is_initialized():
logger.warning(
f"LaunchDarkly not initialized, using default={default}"
)
is_enabled = default
else:
context = create_context(str(user_id))
is_enabled = get_client().variation(flag_key, context, default)
if not is_enabled:
raise HTTPException(status_code=404, detail="Feature not available")
return cast(T, func(*args, **kwargs))
except Exception as e:
logger.error(f"Error evaluating feature flag {flag_key}: {e}")
raise
return cast(
Callable[P, Union[T, Awaitable[T]]],
async_wrapper if asyncio.iscoroutinefunction(func) else sync_wrapper,
)
return decorator
def percentage_rollout(
flag_key: str,
default: bool = False,
) -> Callable[
[Callable[P, Union[T, Awaitable[T]]]], Callable[P, Union[T, Awaitable[T]]]
]:
"""Decorator for percentage-based rollouts."""
return feature_flag(flag_key, default)
def beta_feature(
flag_key: Optional[str] = None,
unauthorized_response: Any = {"message": "Not available in beta"},
) -> Callable[
[Callable[P, Union[T, Awaitable[T]]]], Callable[P, Union[T, Awaitable[T]]]
]:
"""Decorator for beta features."""
actual_key = f"beta-{flag_key}" if flag_key else "beta"
return feature_flag(actual_key, False)
@contextlib.contextmanager
def mock_flag_variation(flag_key: str, return_value: Any):
"""Context manager for testing feature flags."""
original_variation = get_client().variation
get_client().variation = lambda key, context, default: (
return_value if key == flag_key else original_variation(key, context, default)
)
try:
yield
finally:
get_client().variation = original_variation
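A brief, hypothetical sketch of how the feature_flag decorator above would typically be applied; the route path and flag key are illustrative, and it assumes initialize_launchdarkly() ran at startup. The wrapper reads user_id from the keyword arguments and raises a 404 when the flag is off for that user.

# Hypothetical endpoint gated by the decorator above.
import fastapi

app = fastapi.FastAPI()

@app.get("/api/beta-reports")
@feature_flag("beta-reports", default=False)
async def beta_reports(user_id: str):
    # Reached only when "beta-reports" is enabled for this user;
    # otherwise the decorator raises HTTPException(404).
    return {"reports": []}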


@ -0,0 +1,45 @@
import pytest
from ldclient import LDClient
from autogpt_libs.feature_flag.client import feature_flag, mock_flag_variation
@pytest.fixture
def ld_client(mocker):
client = mocker.Mock(spec=LDClient)
mocker.patch("ldclient.get", return_value=client)
client.is_initialized.return_value = True
return client
@pytest.mark.asyncio
async def test_feature_flag_enabled(ld_client):
ld_client.variation.return_value = True
@feature_flag("test-flag")
async def test_function(user_id: str):
return "success"
result = test_function(user_id="test-user")
assert result == "success"
ld_client.variation.assert_called_once()
@pytest.mark.asyncio
async def test_feature_flag_unauthorized_response(ld_client):
ld_client.variation.return_value = False
@feature_flag("test-flag")
async def test_function(user_id: str):
return "success"
result = test_function(user_id="test-user")
assert result == {"error": "disabled"}
def test_mock_flag_variation(ld_client):
with mock_flag_variation("test-flag", True):
assert ld_client.variation("test-flag", None, False)
with mock_flag_variation("test-flag", False):
assert ld_client.variation("test-flag", None, False)


@ -0,0 +1,15 @@
from pydantic import Field
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
launch_darkly_sdk_key: str = Field(
default="",
description="The Launch Darkly SDK key",
validation_alias="LAUNCH_DARKLY_SDK_KEY",
)
model_config = SettingsConfigDict(case_sensitive=True, extra="ignore")
SETTINGS = Settings()


@ -6,6 +6,7 @@ from pathlib import Path
from pydantic import Field, field_validator
from pydantic_settings import BaseSettings, SettingsConfigDict
from .filters import BelowLevelFilter
from .formatters import AGPTFormatter, StructuredLoggingFormatter
@ -22,7 +23,6 @@ DEBUG_LOG_FORMAT = (
class LoggingConfig(BaseSettings):
level: str = Field(
default="INFO",
description="Logging level",


@ -24,10 +24,10 @@ from .utils import remove_color_codes
),
("", ""),
("hello", "hello"),
("hello\x1B[31m world", "hello world"),
("\x1B[36mHello,\x1B[32m World!", "Hello, World!"),
("hello\x1b[31m world", "hello world"),
("\x1b[36mHello,\x1b[32m World!", "Hello, World!"),
(
"\x1B[1m\x1B[31mError:\x1B[0m\x1B[31m file not found",
"\x1b[1m\x1b[31mError:\x1b[0m\x1b[31m file not found",
"Error: file not found",
),
],

View File

@ -0,0 +1,31 @@
from pydantic import Field
from pydantic_settings import BaseSettings, SettingsConfigDict
class RateLimitSettings(BaseSettings):
redis_host: str = Field(
default="redis://localhost:6379",
description="Redis host",
validation_alias="REDIS_HOST",
)
redis_port: str = Field(
default="6379", description="Redis port", validation_alias="REDIS_PORT"
)
redis_password: str = Field(
default="password",
description="Redis password",
validation_alias="REDIS_PASSWORD",
)
requests_per_minute: int = Field(
default=60,
description="Maximum number of requests allowed per minute per API key",
validation_alias="RATE_LIMIT_REQUESTS_PER_MINUTE",
)
model_config = SettingsConfigDict(case_sensitive=True, extra="ignore")
RATE_LIMIT_SETTINGS = RateLimitSettings()


@ -0,0 +1,51 @@
import time
from typing import Tuple
from redis import Redis
from .config import RATE_LIMIT_SETTINGS
class RateLimiter:
def __init__(
self,
redis_host: str = RATE_LIMIT_SETTINGS.redis_host,
redis_port: str = RATE_LIMIT_SETTINGS.redis_port,
redis_password: str = RATE_LIMIT_SETTINGS.redis_password,
requests_per_minute: int = RATE_LIMIT_SETTINGS.requests_per_minute,
):
self.redis = Redis(
host=redis_host,
port=int(redis_port),
password=redis_password,
decode_responses=True,
)
self.window = 60
self.max_requests = requests_per_minute
async def check_rate_limit(self, api_key_id: str) -> Tuple[bool, int, int]:
"""
Check if request is within rate limits.
Args:
api_key_id: The API key identifier to check
Returns:
Tuple of (is_allowed, remaining_requests, reset_time)
"""
now = time.time()
window_start = now - self.window
key = f"ratelimit:{api_key_id}:1min"
pipe = self.redis.pipeline()
pipe.zremrangebyscore(key, 0, window_start)
pipe.zadd(key, {str(now): now})
pipe.zcount(key, window_start, now)
pipe.expire(key, self.window)
_, _, request_count, _ = pipe.execute()
remaining = max(0, self.max_requests - request_count)
reset_time = int(now + self.window)
return request_count <= self.max_requests, remaining, reset_time
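The limiter above keeps a sliding one-minute window in a Redis sorted set: entries older than the window are pruned, the current request is recorded with its timestamp as score, and the remaining entries are counted. A small illustrative driver, assuming a reachable Redis with the configured credentials:

# Illustrative driver for RateLimiter (assumes Redis is reachable).
import asyncio

async def main() -> None:
    limiter = RateLimiter(requests_per_minute=60)
    allowed, remaining, reset_time = await limiter.check_rate_limit("api-key-123")
    if allowed:
        print(f"allowed, {remaining} requests left until {reset_time}")
    else:
        print(f"throttled, retry after {reset_time}")

asyncio.run(main())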


@ -0,0 +1,32 @@
from fastapi import HTTPException, Request
from starlette.middleware.base import RequestResponseEndpoint
from .limiter import RateLimiter
async def rate_limit_middleware(request: Request, call_next: RequestResponseEndpoint):
"""FastAPI middleware for rate limiting API requests."""
limiter = RateLimiter()
if not request.url.path.startswith("/api"):
return await call_next(request)
api_key = request.headers.get("Authorization")
if not api_key:
return await call_next(request)
api_key = api_key.replace("Bearer ", "")
is_allowed, remaining, reset_time = await limiter.check_rate_limit(api_key)
if not is_allowed:
raise HTTPException(
status_code=429, detail="Rate limit exceeded. Please try again later."
)
response = await call_next(request)
response.headers["X-RateLimit-Limit"] = str(limiter.max_requests)
response.headers["X-RateLimit-Remaining"] = str(remaining)
response.headers["X-RateLimit-Reset"] = str(reset_time)
return response
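A one-line wiring sketch (the app object is hypothetical) for attaching this middleware so that /api requests carrying a Bearer key pass through the limiter:

# Hypothetical wiring of rate_limit_middleware onto a FastAPI app.
from fastapi import FastAPI

app = FastAPI()
app.middleware("http")(rate_limit_middleware)  # registers the function above as HTTP middleware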


@ -1,9 +0,0 @@
from .store import SupabaseIntegrationCredentialsStore
from .types import Credentials, APIKeyCredentials, OAuth2Credentials
__all__ = [
"SupabaseIntegrationCredentialsStore",
"Credentials",
"APIKeyCredentials",
"OAuth2Credentials",
]


@ -56,6 +56,7 @@ class OAuthState(BaseModel):
token: str
provider: str
expires_at: int
code_verifier: Optional[str] = None
scopes: list[str]
"""Unix timestamp (seconds) indicating when this OAuth state expires"""


@ -1,5 +1,5 @@
from typing import Callable, TypeVar, ParamSpec
import threading
from typing import Callable, ParamSpec, TypeVar
P = ParamSpec("P")
R = TypeVar("R")


@ -31,7 +31,8 @@ class RedisKeyedMutex:
try:
yield
finally:
lock.release()
if lock.locked():
lock.release()
def acquire(self, key: Any) -> "RedisLock":
"""Acquires and returns a lock with the given key"""
@ -45,7 +46,7 @@ class RedisKeyedMutex:
return lock
def release(self, key: Any):
if lock := self.locks.get(key):
if (lock := self.locks.get(key)) and lock.locked() and lock.owned():
lock.release()
def release_all_locks(self):
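The guarded release above avoids redis-py raising LockError when a lock has already expired or is held by another owner. A generic sketch of the same pattern with a plain redis-py lock (connection details are illustrative):

# Generic guarded-release pattern with redis-py; mirrors the change above.
from redis import Redis

r = Redis(host="localhost", port=6379)
lock = r.lock("my-key", timeout=60)
if lock.acquire(blocking=True):
    try:
        ...  # critical section; may outlive the lock timeout
    finally:
        # Only release if we still hold the lock; releasing a lost lock raises LockError.
        if lock.locked() and lock.owned():
            lock.release()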


@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand.
[[package]]
name = "aiohappyeyeballs"
@ -626,13 +626,13 @@ grpc = ["grpcio (>=1.44.0,<2.0.0.dev0)"]
[[package]]
name = "gotrue"
version = "2.9.3"
version = "2.10.0"
description = "Python Client Library for Supabase Auth"
optional = false
python-versions = "<4.0,>=3.9"
files = [
{file = "gotrue-2.9.3-py3-none-any.whl", hash = "sha256:9d2e9c74405d879f4828e0a7b94daf167a6e109c10ae6e5c59a0e21446f6e423"},
{file = "gotrue-2.9.3.tar.gz", hash = "sha256:051551d80e642bdd2ab42cac78207745d89a2a08f429a1512d82624e675d8255"},
{file = "gotrue-2.10.0-py3-none-any.whl", hash = "sha256:768e58207488e5184ffbdc4351b7280d913daf97962f4e9f2cca05c80004b042"},
{file = "gotrue-2.10.0.tar.gz", hash = "sha256:4edf4c251da3535f2b044e23deba221e848ca1210c17d0c7a9b19f79a1e3f3c0"},
]
[package.dependencies]
@ -854,6 +854,17 @@ doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linke
perf = ["ipython"]
test = ["flufl.flake8", "importlib-resources (>=1.3)", "jaraco.test (>=5.4)", "packaging", "pyfakefs", "pytest (>=6,!=8.1.*)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy", "pytest-perf (>=0.9.2)", "pytest-ruff (>=0.2.1)"]
[[package]]
name = "iniconfig"
version = "2.0.0"
description = "brain-dead simple config-ini parsing"
optional = false
python-versions = ">=3.7"
files = [
{file = "iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374"},
{file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"},
]
[[package]]
name = "multidict"
version = "6.1.0"
@ -984,15 +995,30 @@ files = [
{file = "packaging-24.1.tar.gz", hash = "sha256:026ed72c8ed3fcce5bf8950572258698927fd1dbda10a5e981cdf0ac37f4f002"},
]
[[package]]
name = "pluggy"
version = "1.5.0"
description = "plugin and hook calling mechanisms for python"
optional = false
python-versions = ">=3.8"
files = [
{file = "pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669"},
{file = "pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1"},
]
[package.extras]
dev = ["pre-commit", "tox"]
testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "postgrest"
version = "0.17.2"
version = "0.18.0"
description = "PostgREST client for Python. This library provides an ORM interface to PostgREST."
optional = false
python-versions = "<4.0,>=3.9"
files = [
{file = "postgrest-0.17.2-py3-none-any.whl", hash = "sha256:f7c4f448e5a5e2d4c1dcf192edae9d1007c4261e9a6fb5116783a0046846ece2"},
{file = "postgrest-0.17.2.tar.gz", hash = "sha256:445cd4e4a191e279492549df0c4e827d32f9d01d0852599bb8a6efb0f07fcf78"},
{file = "postgrest-0.18.0-py3-none-any.whl", hash = "sha256:200baad0d23fee986b3a0ffd3e07bfe0cdd40e09760f11e8e13a6c0c2376d5fa"},
{file = "postgrest-0.18.0.tar.gz", hash = "sha256:29c1a94801a17eb9ad590189993fe5a7a6d8c1bfc11a3c9d0ce7ba146454ebb3"},
]
[package.dependencies]
@ -1065,22 +1091,19 @@ pyasn1 = ">=0.4.6,<0.7.0"
[[package]]
name = "pydantic"
version = "2.9.2"
version = "2.10.3"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
{file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
{file = "pydantic-2.10.3-py3-none-any.whl", hash = "sha256:be04d85bbc7b65651c5f8e6b9976ed9c6f41782a55524cef079a34a0bb82144d"},
{file = "pydantic-2.10.3.tar.gz", hash = "sha256:cb5ac360ce894ceacd69c403187900a02c4b20b693a9dd1d643e1effab9eadf9"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
pydantic-core = "2.23.4"
typing-extensions = [
{version = ">=4.12.2", markers = "python_version >= \"3.13\""},
{version = ">=4.6.1", markers = "python_version < \"3.13\""},
]
pydantic-core = "2.27.1"
typing-extensions = ">=4.12.2"
[package.extras]
email = ["email-validator (>=2.0.0)"]
@ -1088,100 +1111,111 @@ timezone = ["tzdata"]
[[package]]
name = "pydantic-core"
version = "2.23.4"
version = "2.27.1"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
{file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
{file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
{file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
{file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
{file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
{file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
{file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
{file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
{file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
{file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
{file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
{file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
{file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
{file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
{file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
{file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
{file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
{file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
{file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
{file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
{file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
{file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
{file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
{file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
{file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
{file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
{file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
{file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
{file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
{file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
{file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
{file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
{file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
{file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
{file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
{file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
{file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
{file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
{file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
{file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
{file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
{file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
{file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
{file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a"},
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206"},
{file = "pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c"},
{file = "pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc"},
{file = "pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9"},
{file = "pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5"},
{file = "pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae"},
{file = "pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c"},
{file = "pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = "sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16"},
{file = "pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23"},
{file = "pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05"},
{file = "pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337"},
{file = "pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:5897bec80a09b4084aee23f9b73a9477a46c3304ad1d2d07acca19723fb1de62"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d0165ab2914379bd56908c02294ed8405c252250668ebcb438a55494c69f44ab"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b9af86e1d8e4cfc82c2022bfaa6f459381a50b94a29e95dcdda8442d6d83864"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f6c8a66741c5f5447e047ab0ba7a1c61d1e95580d64bce852e3df1f895c4067"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9a42d6a8156ff78981f8aa56eb6394114e0dedb217cf8b729f438f643608cbcd"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:64c65f40b4cd8b0e049a8edde07e38b476da7e3aaebe63287c899d2cff253fa5"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdcf339322a3fae5cbd504edcefddd5a50d9ee00d968696846f089b4432cf78"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf99c8404f008750c846cb4ac4667b798a9f7de673ff719d705d9b2d6de49c5f"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:8f1edcea27918d748c7e5e4d917297b2a0ab80cad10f86631e488b7cddf76a36"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:159cac0a3d096f79ab6a44d77a961917219707e2a130739c64d4dd46281f5c2a"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:029d9757eb621cc6e1848fa0b0310310de7301057f623985698ed7ebb014391b"},
{file = "pydantic_core-2.27.1-cp38-none-win32.whl", hash = "sha256:a28af0695a45f7060e6f9b7092558a928a28553366519f64083c63a44f70e618"},
{file = "pydantic_core-2.27.1-cp38-none-win_amd64.whl", hash = "sha256:2d4567c850905d5eaaed2f7a404e61012a51caf288292e016360aa2b96ff38d4"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:e9386266798d64eeb19dd3677051f5705bf873e98e15897ddb7d76f477131967"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4228b5b646caa73f119b1ae756216b59cc6e2267201c27d3912b592c5e323b60"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b3dfe500de26c52abe0477dde16192ac39c98f05bf2d80e76102d394bd13854"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aee66be87825cdf72ac64cb03ad4c15ffef4143dbf5c113f64a5ff4f81477bf9"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b748c44bb9f53031c8cbc99a8a061bc181c1000c60a30f55393b6e9c45cc5bd"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ca038c7f6a0afd0b2448941b6ef9d5e1949e999f9e5517692eb6da58e9d44be"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e0bd57539da59a3e4671b90a502da9a28c72322a4f17866ba3ac63a82c4498e"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ac6c2c45c847bbf8f91930d88716a0fb924b51e0c6dad329b793d670ec5db792"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b94d4ba43739bbe8b0ce4262bcc3b7b9f31459ad120fb595627eaeb7f9b9ca01"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:00e6424f4b26fe82d44577b4c842d7df97c20be6439e8e685d0d715feceb9fb9"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:38de0a70160dd97540335b7ad3a74571b24f1dc3ed33f815f0880682e6880131"},
{file = "pydantic_core-2.27.1-cp39-none-win32.whl", hash = "sha256:7ccebf51efc61634f6c2344da73e366c75e735960b5654b63d7e6f69a5885fa3"},
{file = "pydantic_core-2.27.1-cp39-none-win_amd64.whl", hash = "sha256:a57847b090d7892f123726202b7daa20df6694cbd583b67a592e856bff603d6c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5fde892e6c697ce3e30c61b239330fc5d569a71fefd4eb6512fc6caec9dd9e2f"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:816f5aa087094099fff7edabb5e01cc370eb21aa1a1d44fe2d2aefdfb5599b31"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c10c309e18e443ddb108f0ef64e8729363adbfd92d6d57beec680f6261556f3"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98476c98b02c8e9b2eec76ac4156fd006628b1b2d0ef27e548ffa978393fd154"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c3027001c28434e7ca5a6e1e527487051136aa81803ac812be51802150d880dd"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:7699b1df36a48169cdebda7ab5a2bac265204003f153b4bd17276153d997670a"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1c39b07d90be6b48968ddc8c19e7585052088fd7ec8d568bb31ff64c70ae3c97"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:46ccfe3032b3915586e469d4972973f893c0a2bb65669194a5bdea9bacc088c2"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:62ba45e21cf6571d7f716d903b5b7b6d2617e2d5d67c0923dc47b9d41369f840"},
{file = "pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235"},
]
[package.dependencies]
@ -1189,13 +1223,13 @@ typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
[[package]]
name = "pydantic-settings"
version = "2.6.1"
version = "2.7.0"
description = "Settings management using Pydantic"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic_settings-2.6.1-py3-none-any.whl", hash = "sha256:7fb0637c786a558d3103436278a7c4f1cfd29ba8973238a50c5bb9a55387da87"},
{file = "pydantic_settings-2.6.1.tar.gz", hash = "sha256:e0f92546d8a9923cb8941689abf85d6601a8c19a23e97a34b2964a2e3f813ca0"},
{file = "pydantic_settings-2.7.0-py3-none-any.whl", hash = "sha256:e00c05d5fa6cbbb227c84bd7487c5c1065084119b750df7c8c1a554aed236eb5"},
{file = "pydantic_settings-2.7.0.tar.gz", hash = "sha256:ac4bfd4a36831a48dbf8b2d9325425b549a0a6f18cea118436d728eb4f1c4d66"},
]
[package.dependencies]
@ -1209,13 +1243,13 @@ yaml = ["pyyaml (>=6.0.1)"]
[[package]]
name = "pyjwt"
version = "2.9.0"
version = "2.10.1"
description = "JSON Web Token implementation in Python"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
files = [
{file = "PyJWT-2.9.0-py3-none-any.whl", hash = "sha256:3b02fb0f44517787776cf48f2ae25d8e14f300e6d7545a4315cee571a415e850"},
{file = "pyjwt-2.9.0.tar.gz", hash = "sha256:7e1e5b56cc735432a7369cbfa0efe50fa113ebecdc04ae6922deba8b84582d0c"},
{file = "PyJWT-2.10.1-py3-none-any.whl", hash = "sha256:dcdd193e30abefd5debf142f9adfcdd2b58004e644f25406ffaebd50bd98dacb"},
{file = "pyjwt-2.10.1.tar.gz", hash = "sha256:3cc5772eb20009233caf06e9d8a0577824723b44e6648ee0a2aedb6cf9381953"},
]
[package.extras]
@ -1224,6 +1258,63 @@ dev = ["coverage[toml] (==5.0.4)", "cryptography (>=3.4.0)", "pre-commit", "pyte
docs = ["sphinx", "sphinx-rtd-theme", "zope.interface"]
tests = ["coverage[toml] (==5.0.4)", "pytest (>=6.0.0,<7.0.0)"]
[[package]]
name = "pytest"
version = "8.3.3"
description = "pytest: simple powerful testing with Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "pytest-8.3.3-py3-none-any.whl", hash = "sha256:a6853c7375b2663155079443d2e45de913a911a11d669df02a50814944db57b2"},
{file = "pytest-8.3.3.tar.gz", hash = "sha256:70b98107bd648308a7952b06e6ca9a50bc660be218d53c257cc1fc94fda10181"},
]
[package.dependencies]
colorama = {version = "*", markers = "sys_platform == \"win32\""}
exceptiongroup = {version = ">=1.0.0rc8", markers = "python_version < \"3.11\""}
iniconfig = "*"
packaging = "*"
pluggy = ">=1.5,<2"
tomli = {version = ">=1", markers = "python_version < \"3.11\""}
[package.extras]
dev = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "pygments (>=2.7.2)", "requests", "setuptools", "xmlschema"]
[[package]]
name = "pytest-asyncio"
version = "0.25.0"
description = "Pytest support for asyncio"
optional = false
python-versions = ">=3.9"
files = [
{file = "pytest_asyncio-0.25.0-py3-none-any.whl", hash = "sha256:db5432d18eac6b7e28b46dcd9b69921b55c3b1086e85febfe04e70b18d9e81b3"},
{file = "pytest_asyncio-0.25.0.tar.gz", hash = "sha256:8c0610303c9e0442a5db8604505fc0f545456ba1528824842b37b4a626cbf609"},
]
[package.dependencies]
pytest = ">=8.2,<9"
[package.extras]
docs = ["sphinx (>=5.3)", "sphinx-rtd-theme (>=1)"]
testing = ["coverage (>=6.2)", "hypothesis (>=5.7.1)"]
[[package]]
name = "pytest-mock"
version = "3.14.0"
description = "Thin-wrapper around the mock package for easier use with pytest"
optional = false
python-versions = ">=3.8"
files = [
{file = "pytest-mock-3.14.0.tar.gz", hash = "sha256:2719255a1efeceadbc056d6bf3df3d1c5015530fb40cf347c0f9afac88410bd0"},
{file = "pytest_mock-3.14.0-py3-none-any.whl", hash = "sha256:0b72c38033392a5f4621342fe11e9219ac11ec9d375f8e2a0c164539e0d70f6f"},
]
[package.dependencies]
pytest = ">=6.2.5"
[package.extras]
dev = ["pre-commit", "pytest-asyncio", "tox"]
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
@ -1271,13 +1362,13 @@ websockets = ">=11,<13"
[[package]]
name = "redis"
version = "5.2.0"
version = "5.2.1"
description = "Python client for Redis database and key-value store"
optional = false
python-versions = ">=3.8"
files = [
{file = "redis-5.2.0-py3-none-any.whl", hash = "sha256:ae174f2bb3b1bf2b09d54bf3e51fbc1469cf6c10aa03e21141f51969801a7897"},
{file = "redis-5.2.0.tar.gz", hash = "sha256:0b1087665a771b1ff2e003aa5bdd354f15a70c9e25d5a7dbf9c722c16528a7b0"},
{file = "redis-5.2.1-py3-none-any.whl", hash = "sha256:ee7e1056b9aea0f04c6c2ed59452947f34c4940ee025f5dd83e6a6418b6989e4"},
{file = "redis-5.2.1.tar.gz", hash = "sha256:16f2e22dff21d5125e8481515e386711a34cbec50f0e44413dd7d9c060a54e0f"},
]
[package.dependencies]
@ -1322,6 +1413,33 @@ files = [
[package.dependencies]
pyasn1 = ">=0.1.3"
[[package]]
name = "ruff"
version = "0.8.6"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
files = [
{file = "ruff-0.8.6-py3-none-linux_armv6l.whl", hash = "sha256:defed167955d42c68b407e8f2e6f56ba52520e790aba4ca707a9c88619e580e3"},
{file = "ruff-0.8.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:54799ca3d67ae5e0b7a7ac234baa657a9c1784b48ec954a094da7c206e0365b1"},
{file = "ruff-0.8.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:e88b8f6d901477c41559ba540beeb5a671e14cd29ebd5683903572f4b40a9807"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0509e8da430228236a18a677fcdb0c1f102dd26d5520f71f79b094963322ed25"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:91a7ddb221779871cf226100e677b5ea38c2d54e9e2c8ed847450ebbdf99b32d"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:248b1fb3f739d01d528cc50b35ee9c4812aa58cc5935998e776bf8ed5b251e75"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:bc3c083c50390cf69e7e1b5a5a7303898966be973664ec0c4a4acea82c1d4315"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:52d587092ab8df308635762386f45f4638badb0866355b2b86760f6d3c076188"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:61323159cf21bc3897674e5adb27cd9e7700bab6b84de40d7be28c3d46dc67cf"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ae4478b1471fc0c44ed52a6fb787e641a2ac58b1c1f91763bafbc2faddc5117"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:0c000a471d519b3e6cfc9c6680025d923b4ca140ce3e4612d1a2ef58e11f11fe"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:9257aa841e9e8d9b727423086f0fa9a86b6b420fbf4bf9e1465d1250ce8e4d8d"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:45a56f61b24682f6f6709636949ae8cc82ae229d8d773b4c76c09ec83964a95a"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:496dd38a53aa173481a7d8866bcd6451bd934d06976a2505028a50583e001b76"},
{file = "ruff-0.8.6-py3-none-win32.whl", hash = "sha256:e169ea1b9eae61c99b257dc83b9ee6c76f89042752cb2d83486a7d6e48e8f764"},
{file = "ruff-0.8.6-py3-none-win_amd64.whl", hash = "sha256:f1d70bef3d16fdc897ee290d7d20da3cbe4e26349f62e8a0274e7a3f4ce7a905"},
{file = "ruff-0.8.6-py3-none-win_arm64.whl", hash = "sha256:7d7fc2377a04b6e04ffe588caad613d0c460eb2ecba4c0ccbbfe2bc973cbc162"},
{file = "ruff-0.8.6.tar.gz", hash = "sha256:dcad24b81b62650b0eb8814f576fc65cfee8674772a6e24c9b747911801eeaa5"},
]
[[package]]
name = "six"
version = "1.16.0"
@ -1346,19 +1464,18 @@ files = [
[[package]]
name = "storage3"
version = "0.8.2"
version = "0.9.0"
description = "Supabase Storage client for Python."
optional = false
python-versions = "<4.0,>=3.9"
files = [
{file = "storage3-0.8.2-py3-none-any.whl", hash = "sha256:f2e995b18c77a2a9265d1a33047d43e4d6abb11eb3ca5067959f68281c305de3"},
{file = "storage3-0.8.2.tar.gz", hash = "sha256:db05d3fe8fb73bd30c814c4c4749664f37a5dfc78b629e8c058ef558c2b89f5a"},
{file = "storage3-0.9.0-py3-none-any.whl", hash = "sha256:8b2fb91f0c61583a2f4eac74a8bae67e00d41ff38095c8a6cd3f2ce5e0ab76e7"},
{file = "storage3-0.9.0.tar.gz", hash = "sha256:e16697f60894c94e1d9df0d2e4af783c1b3f7dd08c9013d61978825c624188c4"},
]
[package.dependencies]
httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
python-dateutil = ">=2.8.2,<3.0.0"
typing-extensions = ">=4.2.0,<5.0.0"
[[package]]
name = "strenum"
@ -1378,37 +1495,48 @@ test = ["pylint", "pytest", "pytest-black", "pytest-cov", "pytest-pylint"]
[[package]]
name = "supabase"
version = "2.9.1"
version = "2.10.0"
description = "Supabase client for Python."
optional = false
python-versions = "<4.0,>=3.9"
files = [
{file = "supabase-2.9.1-py3-none-any.whl", hash = "sha256:a96f857a465712cb551679c1df66ba772c834f861756ce4aa2aa4cb703f6aeb7"},
{file = "supabase-2.9.1.tar.gz", hash = "sha256:51fce39c9eb50573126dabb342541ec5e1f13e7476938768f4b0ccfdb8c522cd"},
{file = "supabase-2.10.0-py3-none-any.whl", hash = "sha256:183fb23c04528593f8f81c24ceb8178f3a56bff40fec7ed873b6c55ebc2e420a"},
{file = "supabase-2.10.0.tar.gz", hash = "sha256:9ac095f8947bf60780e67c0edcbab53e2db3f6f3f022329397b093500bf2607c"},
]
[package.dependencies]
gotrue = ">=2.9.0,<3.0.0"
gotrue = ">=2.10.0,<3.0.0"
httpx = ">=0.26,<0.28"
postgrest = ">=0.17.0,<0.18.0"
postgrest = ">=0.18,<0.19"
realtime = ">=2.0.0,<3.0.0"
storage3 = ">=0.8.0,<0.9.0"
supafunc = ">=0.6.0,<0.7.0"
storage3 = ">=0.9.0,<0.10.0"
supafunc = ">=0.7.0,<0.8.0"
[[package]]
name = "supafunc"
version = "0.6.2"
version = "0.7.0"
description = "Library for Supabase Functions"
optional = false
python-versions = "<4.0,>=3.9"
files = [
{file = "supafunc-0.6.2-py3-none-any.whl", hash = "sha256:101b30616b0a1ce8cf938eca1df362fa4cf1deacb0271f53ebbd674190fb0da5"},
{file = "supafunc-0.6.2.tar.gz", hash = "sha256:c7dfa20db7182f7fe4ae436e94e05c06cd7ed98d697fed75d68c7b9792822adc"},
{file = "supafunc-0.7.0-py3-none-any.whl", hash = "sha256:4160260dc02bdd906be1e2ffd7cb3ae8b74ae437c892bb475352b6a99d9ff8eb"},
{file = "supafunc-0.7.0.tar.gz", hash = "sha256:5b1c415fba1395740b2b4eedd1d786384bd58b98f6333a11ba7889820a48b6a7"},
]
[package.dependencies]
httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
[[package]]
name = "tomli"
version = "2.1.0"
description = "A lil' TOML parser"
optional = false
python-versions = ">=3.8"
files = [
{file = "tomli-2.1.0-py3-none-any.whl", hash = "sha256:a5c57c3d1c56f5ccdf89f6523458f60ef716e210fc47c4cfb188c5ba473e0391"},
{file = "tomli-2.1.0.tar.gz", hash = "sha256:3f646cae2aec94e17d04973e4249548320197cfabdf130015d023de4b74d8ab8"},
]
[[package]]
name = "typing-extensions"
version = "4.12.2"
@ -1724,4 +1852,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.10,<4.0"
content-hash = "f80654aae542b1f2f3a44a01f197f87ffbaea52f474dd2cc2b72b8d56b155563"
content-hash = "bf1b0125759dadb1369fff05ffba64fea3e82b9b7a43d0068e1c80974a4ebc1c"

View File

@ -10,16 +10,25 @@ packages = [{ include = "autogpt_libs" }]
colorama = "^0.4.6"
expiringdict = "^1.2.2"
google-cloud-logging = "^3.11.3"
pydantic = "^2.9.2"
pydantic-settings = "^2.6.1"
pyjwt = "^2.8.0"
pydantic = "^2.10.3"
pydantic-settings = "^2.7.0"
pyjwt = "^2.10.1"
pytest-asyncio = "^0.25.0"
pytest-mock = "^3.14.0"
python = ">=3.10,<4.0"
python-dotenv = "^1.0.1"
supabase = "^2.9.1"
supabase = "^2.10.0"
[tool.poetry.group.dev.dependencies]
redis = "^5.2.0"
redis = "^5.2.1"
ruff = "^0.8.6"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.ruff]
line-length = 88
[tool.ruff.lint]
extend-select = ["I"] # sort dependencies
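# The "I" rules add isort-style import sorting on top of ruff's defaults.
# A hypothetical further tweak (not part of this PR) would declare the package
# as first-party so its imports are grouped separately from third-party ones:
[tool.ruff.lint.isort]
known-first-party = ["autogpt_libs"]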

View File

@ -28,8 +28,15 @@ SUPABASE_URL=http://localhost:8000
SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
# For local development, you may need to set FRONTEND_BASE_URL for the OAuth flow for integrations to work.
FRONTEND_BASE_URL=http://localhost:3000
## For local development, you may need to set FRONTEND_BASE_URL for the OAuth flow
## for integrations to work. Defaults to the value of PLATFORM_BASE_URL if not set.
# FRONTEND_BASE_URL=http://localhost:3000
## PLATFORM_BASE_URL must be set to a *publicly accessible* URL pointing to your backend
## to use the platform's webhook-related functionality.
## If you are developing locally, you can use something like ngrok to get a public URL
## and tunnel it to your locally running backend.
PLATFORM_BASE_URL=https://your-public-url-here
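## Example of a typical local setup (illustrative values, not required ones): run the backend
## on its default port 8000, start a tunnel with "ngrok http 8000", then use the https URL
## that ngrok prints, e.g. PLATFORM_BASE_URL=https://abc123.ngrok-free.app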
## == INTEGRATION CREDENTIALS == ##
# Each set of server side credentials is required for the corresponding 3rd party
@ -51,12 +58,28 @@ GITHUB_CLIENT_SECRET=
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
# Twitter (X) OAuth 2.0 with PKCE Configuration
# 1. Create a Twitter Developer Account:
# - Visit https://developer.x.com/en and sign up
# 2. Set up your application:
# - Navigate to Developer Portal > Projects > Create Project
# - Add a new app to your project
# 3. Configure app settings:
# - App Permissions: Read + Write + Direct Messages
# - App Type: Web App, Automated App or Bot
# - OAuth 2.0 Callback URL: http://localhost:3000/auth/integrations/oauth_callback
# - Save your Client ID and Client Secret below
TWITTER_CLIENT_ID=
TWITTER_CLIENT_SECRET=
## ===== OPTIONAL API KEYS ===== ##
# LLM
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GROQ_API_KEY=
OPEN_ROUTER_API_KEY=
# Reddit
REDDIT_CLIENT_ID=
@ -98,6 +121,18 @@ REPLICATE_API_KEY=
# Ideogram
IDEOGRAM_API_KEY=
# Fal
FAL_API_KEY=
# Exa
EXA_API_KEY=
# E2B
E2B_API_KEY=
# Nvidia
NVIDIA_API_KEY=
# Logging Configuration
LOG_LEVEL=INFO
ENABLE_CLOUD_LOGGING=false

View File

@ -5,4 +5,7 @@ dev.db-journal
build/
config.json
secrets/*
!secrets/.gitkeep
!secrets/.gitkeep
*.ignore.*
*.ign.*

View File

@ -1,4 +1,4 @@
FROM python:3.11-slim-buster AS builder
FROM python:3.11.10-slim-bookworm AS builder
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
@ -6,17 +6,21 @@ ENV PYTHONUNBUFFERED 1
WORKDIR /app
# Install build dependencies
RUN apt-get update \
&& apt-get install -y build-essential curl ffmpeg wget libcurl4-gnutls-dev libexpat1-dev libpq5 gettext libz-dev libssl-dev postgresql-client git \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
RUN echo 'Acquire::http::Pipeline-Depth 0;\nAcquire::http::No-Cache true;\nAcquire::BrokenProxy true;\n' > /etc/apt/apt.conf.d/99fixbadproxy
ENV POETRY_VERSION=1.8.3 \
POETRY_HOME="/opt/poetry" \
POETRY_NO_INTERACTION=1 \
POETRY_VIRTUALENVS_CREATE=false \
PATH="$POETRY_HOME/bin:$PATH"
RUN apt-get update --allow-releaseinfo-change --fix-missing
# Install build dependencies
RUN apt-get install -y build-essential
RUN apt-get install -y libpq5
RUN apt-get install -y libz-dev
RUN apt-get install -y libssl-dev
RUN apt-get install -y postgresql-client
ENV POETRY_HOME=/opt/poetry
ENV POETRY_NO_INTERACTION=1
ENV POETRY_VIRTUALENVS_CREATE=false
ENV PATH=/opt/poetry/bin:$PATH
# Upgrade pip and setuptools to fix security vulnerabilities
RUN pip3 install --upgrade pip setuptools
@ -27,24 +31,20 @@ RUN pip3 install poetry
COPY autogpt_platform/autogpt_libs /app/autogpt_platform/autogpt_libs
COPY autogpt_platform/backend/poetry.lock autogpt_platform/backend/pyproject.toml /app/autogpt_platform/backend/
WORKDIR /app/autogpt_platform/backend
RUN poetry config virtualenvs.create false \
&& poetry install --no-interaction --no-ansi
RUN poetry install --no-ansi --no-root
# Generate Prisma client
COPY autogpt_platform/backend/schema.prisma ./
RUN poetry config virtualenvs.create false \
&& poetry run prisma generate
RUN poetry run prisma generate
FROM python:3.11-slim-buster AS server_dependencies
FROM python:3.11.10-slim-bookworm AS server_dependencies
WORKDIR /app
ENV POETRY_VERSION=1.8.3 \
POETRY_HOME="/opt/poetry" \
ENV POETRY_HOME=/opt/poetry \
POETRY_NO_INTERACTION=1 \
POETRY_VIRTUALENVS_CREATE=false \
PATH="$POETRY_HOME/bin:$PATH"
POETRY_VIRTUALENVS_CREATE=false
ENV PATH=/opt/poetry/bin:$PATH
# Upgrade pip and setuptools to fix security vulnerabilities
RUN pip3 install --upgrade pip setuptools
@ -71,6 +71,7 @@ WORKDIR /app/autogpt_platform/backend
FROM server_dependencies AS server
COPY autogpt_platform/backend /app/autogpt_platform/backend
RUN poetry install --no-ansi --only-root
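# Dependencies were installed with --no-root in the builder stage; installing --only-root
# here registers just the project package after its source has been copied in, so the
# dependency layers stay cached across source-only changes.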
ENV DATABASE_URL=""
ENV PORT=8000

View File

@ -200,4 +200,4 @@ To add a new agent block, you need to create a new class that inherits from `Blo
* `run` method: the main logic of the block.
* `test_input` & `test_output`: the sample input and output data for the block, which will be used to auto-test the block.
* You can mock the functions declared in the block using the `test_mock` field for your unit tests.
* Once you finish creating the block, you can test it by running `pytest -s test/block/test_block.py`.
* Once you finish creating the block, you can test it by running `poetry run pytest -s test/block/test_block.py`.

View File

@ -15,10 +15,10 @@ modules = [
if f.is_file() and f.name != "__init__.py"
]
for module in modules:
if not re.match("^[a-z_.]+$", module):
if not re.match("^[a-z0-9_.]+$", module):
raise ValueError(
f"Block module {module} error: module name must be lowercase, "
"separated by underscores, and contain only alphabet characters"
"and contain only alphanumeric characters and underscores."
)
importlib.import_module(f".{module}", package=__name__)
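# Illustration (not part of this file): module names the relaxed pattern accepts vs. rejects.
import re

pattern = re.compile(r"^[a-z0-9_.]+$")
assert pattern.match("ai_image_generator")        # ok: lowercase letters and underscores
assert pattern.match("llm.gpt4_block")            # ok: digits and dots are allowed
assert not pattern.match("MyBlock")               # rejected: uppercase letters
assert not pattern.match("my-block")              # rejected: hyphens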
@ -60,13 +60,6 @@ for block_cls in all_subclasses(Block):
input_schema = block.input_schema.model_fields
output_schema = block.output_schema.model_fields
# Prevent duplicate field name in input_schema and output_schema
duplicate_field_names = set(input_schema.keys()) & set(output_schema.keys())
if duplicate_field_names:
raise ValueError(
f"{block.name} has duplicate field names in input_schema and output_schema: {duplicate_field_names}"
)
# Make sure `error` field is a string in the output schema
if "error" in output_schema and output_schema["error"].annotation is not str:
raise ValueError(

View File

@ -0,0 +1,104 @@
import logging
from autogpt_libs.utils.cache import thread_cached
from backend.data.block import (
Block,
BlockCategory,
BlockInput,
BlockOutput,
BlockSchema,
BlockType,
get_block,
)
from backend.data.execution import ExecutionStatus
from backend.data.model import SchemaField
logger = logging.getLogger(__name__)
@thread_cached
def get_executor_manager_client():
from backend.executor import ExecutionManager
from backend.util.service import get_service_client
return get_service_client(ExecutionManager)
@thread_cached
def get_event_bus():
from backend.data.execution import RedisExecutionEventBus
return RedisExecutionEventBus()
class AgentExecutorBlock(Block):
class Input(BlockSchema):
user_id: str = SchemaField(description="User ID")
graph_id: str = SchemaField(description="Graph ID")
graph_version: int = SchemaField(description="Graph Version")
data: BlockInput = SchemaField(description="Input data for the graph")
input_schema: dict = SchemaField(description="Input schema for the graph")
output_schema: dict = SchemaField(description="Output schema for the graph")
class Output(BlockSchema):
pass
def __init__(self):
super().__init__(
id="e189baac-8c20-45a1-94a7-55177ea42565",
description="Executes an existing agent inside your agent",
input_schema=AgentExecutorBlock.Input,
output_schema=AgentExecutorBlock.Output,
block_type=BlockType.AGENT,
categories={BlockCategory.AGENT},
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
executor_manager = get_executor_manager_client()
event_bus = get_event_bus()
graph_exec = executor_manager.add_execution(
graph_id=input_data.graph_id,
graph_version=input_data.graph_version,
user_id=input_data.user_id,
data=input_data.data,
)
log_id = f"Graph #{input_data.graph_id}-V{input_data.graph_version}, exec-id: {graph_exec.graph_exec_id}"
logger.info(f"Starting execution of {log_id}")
for event in event_bus.listen(
graph_id=graph_exec.graph_id, graph_exec_id=graph_exec.graph_exec_id
):
logger.info(
f"Execution {log_id} produced input {event.input_data} output {event.output_data}"
)
if not event.node_id:
if event.status in [
ExecutionStatus.COMPLETED,
ExecutionStatus.TERMINATED,
ExecutionStatus.FAILED,
]:
logger.info(f"Execution {log_id} ended with status {event.status}")
break
else:
continue
if not event.block_id:
logger.warning(f"{log_id} received event without block_id {event}")
continue
block = get_block(event.block_id)
if not block or block.block_type != BlockType.OUTPUT:
continue
output_name = event.input_data.get("name")
if not output_name:
logger.warning(f"{log_id} produced an output with no name {event}")
continue
for output_data in event.output_data.get("output", []):
logger.info(f"Execution {log_id} produced {output_name}: {output_data}")
yield output_name, output_data
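
run() is a generator that re-emits the nested agent's OUTPUT-block values as (name, value) pairs. A rough sketch of how a caller consumes it (the IDs and input data below are placeholders, and a running executor manager plus Redis event bus are needed for events to actually arrive):

block = AgentExecutorBlock()
input_data = AgentExecutorBlock.Input(
    user_id="user-123",
    graph_id="graph-456",
    graph_version=1,
    data={"topic": "red pandas"},
    input_schema={},
    output_schema={},
)
for output_name, output_value in block.run(input_data):
    # Each yielded pair corresponds to one OUTPUT block of the nested graph.
    print(output_name, output_value)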

View File

@ -0,0 +1,325 @@
from enum import Enum
from typing import Literal
import replicate
from pydantic import SecretStr
from replicate.helpers import FileOutput
from backend.data.block import Block, BlockCategory, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
class ImageSize(str, Enum):
"""
Semantic sizes that map reliably across all models
"""
SQUARE = "square" # For profile pictures, icons, etc.
LANDSCAPE = "landscape" # For traditional photos, scenes
PORTRAIT = "portrait" # For vertical photos, portraits
WIDE = "wide" # For cinematic, desktop wallpapers
TALL = "tall" # For mobile wallpapers, stories
# Mapping semantic sizes to model-specific formats
SIZE_TO_SD_RATIO = {
ImageSize.SQUARE: "1:1",
ImageSize.LANDSCAPE: "4:3",
ImageSize.PORTRAIT: "3:4",
ImageSize.WIDE: "16:9",
ImageSize.TALL: "9:16",
}
SIZE_TO_FLUX_RATIO = {
ImageSize.SQUARE: "1:1",
ImageSize.LANDSCAPE: "4:3",
ImageSize.PORTRAIT: "3:4",
ImageSize.WIDE: "16:9",
ImageSize.TALL: "9:16",
}
SIZE_TO_FLUX_DIMENSIONS = {
ImageSize.SQUARE: (1024, 1024),
ImageSize.LANDSCAPE: (1365, 1024),
ImageSize.PORTRAIT: (1024, 1365),
ImageSize.WIDE: (1440, 810), # Adjusted to maintain 16:9 within 1440 limit
ImageSize.TALL: (810, 1440), # Adjusted to maintain 9:16 within 1440 limit
}
SIZE_TO_RECRAFT_DIMENSIONS = {
ImageSize.SQUARE: "1024x1024",
ImageSize.LANDSCAPE: "1365x1024",
ImageSize.PORTRAIT: "1024x1365",
ImageSize.WIDE: "1536x1024",
ImageSize.TALL: "1024x1536",
}
class ImageStyle(str, Enum):
"""
Complete set of supported styles
"""
ANY = "any"
# Realistic image styles
REALISTIC = "realistic_image"
REALISTIC_BW = "realistic_image/b_and_w"
REALISTIC_HDR = "realistic_image/hdr"
REALISTIC_NATURAL = "realistic_image/natural_light"
REALISTIC_STUDIO = "realistic_image/studio_portrait"
REALISTIC_ENTERPRISE = "realistic_image/enterprise"
REALISTIC_HARD_FLASH = "realistic_image/hard_flash"
REALISTIC_MOTION_BLUR = "realistic_image/motion_blur"
# Digital illustration styles
DIGITAL_ART = "digital_illustration"
PIXEL_ART = "digital_illustration/pixel_art"
HAND_DRAWN = "digital_illustration/hand_drawn"
GRAIN = "digital_illustration/grain"
SKETCH = "digital_illustration/infantile_sketch"
POSTER = "digital_illustration/2d_art_poster"
POSTER_2 = "digital_illustration/2d_art_poster_2"
HANDMADE_3D = "digital_illustration/handmade_3d"
HAND_DRAWN_OUTLINE = "digital_illustration/hand_drawn_outline"
ENGRAVING_COLOR = "digital_illustration/engraving_color"
class ImageGenModel(str, Enum):
"""
Available model providers
"""
FLUX = "Flux 1.1 Pro"
FLUX_ULTRA = "Flux 1.1 Pro Ultra"
RECRAFT = "Recraft v3"
SD3_5 = "Stable Diffusion 3.5 Medium"
class AIImageGeneratorBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.REPLICATE], Literal["api_key"]
] = CredentialsField(
description="Enter your Replicate API key to access the image generation API. You can obtain an API key from https://replicate.com/account/api-tokens.",
)
prompt: str = SchemaField(
description="Text prompt for image generation",
placeholder="e.g., 'A red panda using a laptop in a snowy forest'",
title="Prompt",
)
model: ImageGenModel = SchemaField(
description="The AI model to use for image generation",
default=ImageGenModel.SD3_5,
title="Model",
)
size: ImageSize = SchemaField(
description=(
"Format of the generated image:\n"
"- Square: Perfect for profile pictures, icons\n"
"- Landscape: Traditional photo format\n"
"- Portrait: Vertical photos, portraits\n"
"- Wide: Cinematic format, desktop wallpapers\n"
"- Tall: Mobile wallpapers, social media stories"
),
default=ImageSize.SQUARE,
title="Image Format",
)
style: ImageStyle = SchemaField(
description="Visual style for the generated image",
default=ImageStyle.ANY,
title="Image Style",
)
class Output(BlockSchema):
image_url: str = SchemaField(description="URL of the generated image")
error: str = SchemaField(description="Error message if generation failed")
def __init__(self):
super().__init__(
id="ed1ae7a0-b770-4089-b520-1f0005fad19a",
description="Generate images using various AI models through a unified interface",
categories={BlockCategory.AI},
input_schema=AIImageGeneratorBlock.Input,
output_schema=AIImageGeneratorBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"prompt": "An octopus using a laptop in a snowy forest with 'AutoGPT' clearly visible on the screen",
"model": ImageGenModel.RECRAFT,
"size": ImageSize.SQUARE,
"style": ImageStyle.REALISTIC,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"image_url",
"https://replicate.delivery/generated-image.webp",
),
],
test_mock={
"_run_client": lambda *args, **kwargs: "https://replicate.delivery/generated-image.webp"
},
)
def _run_client(
self, credentials: APIKeyCredentials, model_name: str, input_params: dict
):
try:
# Initialize Replicate client
client = replicate.Client(api_token=credentials.api_key.get_secret_value())
# Run the model with input parameters
output = client.run(model_name, input=input_params, wait=False)
# Process output
if isinstance(output, list) and len(output) > 0:
if isinstance(output[0], FileOutput):
result_url = output[0].url
else:
result_url = output[0]
elif isinstance(output, FileOutput):
result_url = output.url
elif isinstance(output, str):
result_url = output
else:
result_url = None
return result_url
except TypeError as e:
raise TypeError(f"Error during model execution: {e}")
except Exception as e:
raise RuntimeError(f"Unexpected error during model execution: {e}")
def generate_image(self, input_data: Input, credentials: APIKeyCredentials):
try:
# Handle style-based prompt modification for models without native style support
modified_prompt = input_data.prompt
if input_data.model not in [ImageGenModel.RECRAFT]:
style_prefix = self._style_to_prompt_prefix(input_data.style)
modified_prompt = f"{style_prefix} {modified_prompt}".strip()
if input_data.model == ImageGenModel.SD3_5:
# Use Stable Diffusion 3.5 with aspect ratio
input_params = {
"prompt": modified_prompt,
"aspect_ratio": SIZE_TO_SD_RATIO[input_data.size],
"output_format": "webp",
"output_quality": 90,
"steps": 40,
"cfg_scale": 7.0,
}
output = self._run_client(
credentials,
"stability-ai/stable-diffusion-3.5-medium",
input_params,
)
return output
elif input_data.model == ImageGenModel.FLUX:
# Use Flux-specific dimensions with 'jpg' format to avoid ReplicateError
width, height = SIZE_TO_FLUX_DIMENSIONS[input_data.size]
input_params = {
"prompt": modified_prompt,
"width": width,
"height": height,
"aspect_ratio": SIZE_TO_FLUX_RATIO[input_data.size],
"output_format": "jpg", # Set to jpg for Flux models
"output_quality": 90,
}
output = self._run_client(
credentials, "black-forest-labs/flux-1.1-pro", input_params
)
return output
elif input_data.model == ImageGenModel.FLUX_ULTRA:
width, height = SIZE_TO_FLUX_DIMENSIONS[input_data.size]
input_params = {
"prompt": modified_prompt,
"width": width,
"height": height,
"aspect_ratio": SIZE_TO_FLUX_RATIO[input_data.size],
"output_format": "jpg",
"output_quality": 90,
}
output = self._run_client(
credentials, "black-forest-labs/flux-1.1-pro-ultra", input_params
)
return output
elif input_data.model == ImageGenModel.RECRAFT:
input_params = {
"prompt": input_data.prompt,
"size": SIZE_TO_RECRAFT_DIMENSIONS[input_data.size],
"style": input_data.style.value,
}
output = self._run_client(
credentials, "recraft-ai/recraft-v3", input_params
)
return output
except Exception as e:
raise RuntimeError(f"Failed to generate image: {str(e)}")
def _style_to_prompt_prefix(self, style: ImageStyle) -> str:
"""
Convert a style enum to a prompt prefix for models without native style support.
"""
if style == ImageStyle.ANY:
return ""
style_map = {
ImageStyle.REALISTIC: "photorealistic",
ImageStyle.REALISTIC_BW: "black and white photograph",
ImageStyle.REALISTIC_HDR: "HDR photograph",
ImageStyle.REALISTIC_NATURAL: "natural light photograph",
ImageStyle.REALISTIC_STUDIO: "studio portrait photograph",
ImageStyle.REALISTIC_ENTERPRISE: "enterprise photograph",
ImageStyle.REALISTIC_HARD_FLASH: "hard flash photograph",
ImageStyle.REALISTIC_MOTION_BLUR: "motion blur photograph",
ImageStyle.DIGITAL_ART: "digital art",
ImageStyle.PIXEL_ART: "pixel art",
ImageStyle.HAND_DRAWN: "hand drawn illustration",
ImageStyle.GRAIN: "grainy digital illustration",
ImageStyle.SKETCH: "sketchy illustration",
ImageStyle.POSTER: "2D art poster",
ImageStyle.POSTER_2: "alternate 2D art poster",
ImageStyle.HANDMADE_3D: "handmade 3D illustration",
ImageStyle.HAND_DRAWN_OUTLINE: "hand drawn outline illustration",
ImageStyle.ENGRAVING_COLOR: "color engraving illustration",
}
style_text = style_map.get(style, "")
return f"{style_text} of" if style_text else ""
def run(self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs):
try:
url = self.generate_image(input_data, credentials)
if url:
yield "image_url", url
else:
yield "error", "Image generation returned an empty result."
except Exception as e:
# Capture and return only the message of the exception, avoiding serialization of non-serializable objects
yield "error", str(e)
# Test credentials stay the same
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="replicate",
api_key=SecretStr("mock-replicate-api-key"),
title="Mock Replicate API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
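generate_image() first prepends a style phrase for models without native style support, then converts the semantic size into the parameters each model expects. A minimal sketch of the Stable Diffusion 3.5 branch, with trimmed copies of the mappings defined above (the full enums and tables are the ones in this file):

# Trimmed copies of the mappings above, for a self-contained example.
SIZE_TO_SD_RATIO = {"square": "1:1", "landscape": "4:3", "wide": "16:9"}
STYLE_PREFIX = {"any": "", "realistic_image": "photorealistic", "digital_illustration": "digital art"}

def build_sd35_params(prompt: str, size: str, style: str) -> dict:
    """Mirror the SD3.5 branch of generate_image(): style prefix plus aspect ratio."""
    prefix = STYLE_PREFIX.get(style, "")
    full_prompt = f"{prefix} of {prompt}".strip() if prefix else prompt
    return {
        "prompt": full_prompt,
        "aspect_ratio": SIZE_TO_SD_RATIO[size],
        "output_format": "webp",
        "output_quality": 90,
        "steps": 40,
        "cfg_scale": 7.0,
    }

print(build_sd35_params("a red panda using a laptop", "wide", "realistic_image"))
# {'prompt': 'photorealistic of a red panda using a laptop', 'aspect_ratio': '16:9', ...}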

View File

@ -0,0 +1,227 @@
import logging
import time
from enum import Enum
from typing import Literal
import replicate
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
logger = logging.getLogger(__name__)
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="replicate",
api_key=SecretStr("mock-replicate-api-key"),
title="Mock Replicate API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.type,
}
# Model version enum
class MusicGenModelVersion(str, Enum):
STEREO_LARGE = "stereo-large"
MELODY_LARGE = "melody-large"
LARGE = "large"
# Audio format enum
class AudioFormat(str, Enum):
WAV = "wav"
MP3 = "mp3"
# Normalization strategy enum
class NormalizationStrategy(str, Enum):
LOUDNESS = "loudness"
CLIP = "clip"
PEAK = "peak"
RMS = "rms"
class AIMusicGeneratorBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.REPLICATE], Literal["api_key"]
] = CredentialsField(
description="The Replicate integration can be used with "
"any API key with sufficient permissions for the blocks it is used on.",
)
prompt: str = SchemaField(
description="A description of the music you want to generate",
placeholder="e.g., 'An upbeat electronic dance track with heavy bass'",
title="Prompt",
)
music_gen_model_version: MusicGenModelVersion = SchemaField(
description="Model to use for generation",
default=MusicGenModelVersion.STEREO_LARGE,
title="Model Version",
)
duration: int = SchemaField(
description="Duration of the generated audio in seconds",
default=8,
title="Duration",
)
temperature: float = SchemaField(
description="Controls the 'conservativeness' of the sampling process. Higher temperature means more diversity",
default=1.0,
title="Temperature",
)
top_k: int = SchemaField(
description="Reduces sampling to the k most likely tokens",
default=250,
title="Top K",
)
top_p: float = SchemaField(
description="Reduces sampling to tokens with cumulative probability of p. When set to 0 (default), top_k sampling is used",
default=0.0,
title="Top P",
)
classifier_free_guidance: int = SchemaField(
description="Increases the influence of inputs on the output. Higher values produce lower-variance outputs that adhere more closely to inputs",
default=3,
title="Classifier Free Guidance",
)
output_format: AudioFormat = SchemaField(
description="Output format for generated audio",
default=AudioFormat.WAV,
title="Output Format",
)
normalization_strategy: NormalizationStrategy = SchemaField(
description="Strategy for normalizing audio",
default=NormalizationStrategy.LOUDNESS,
title="Normalization Strategy",
)
class Output(BlockSchema):
result: str = SchemaField(description="URL of the generated audio file")
error: str = SchemaField(description="Error message if the model run failed")
def __init__(self):
super().__init__(
id="44f6c8ad-d75c-4ae1-8209-aad1c0326928",
description="This block generates music using Meta's MusicGen model on Replicate.",
categories={BlockCategory.AI},
input_schema=AIMusicGeneratorBlock.Input,
output_schema=AIMusicGeneratorBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"prompt": "An upbeat electronic dance track with heavy bass",
"music_gen_model_version": MusicGenModelVersion.STEREO_LARGE,
"duration": 8,
"temperature": 1.0,
"top_k": 250,
"top_p": 0.0,
"classifier_free_guidance": 3,
"output_format": AudioFormat.WAV,
"normalization_strategy": NormalizationStrategy.LOUDNESS,
},
test_output=[
(
"result",
"https://replicate.com/output/generated-audio-url.wav",
),
],
test_mock={
"run_model": lambda api_key, music_gen_model_version, prompt, duration, temperature, top_k, top_p, classifier_free_guidance, output_format, normalization_strategy: "https://replicate.com/output/generated-audio-url.wav",
},
test_credentials=TEST_CREDENTIALS,
)
def run(
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
max_retries = 3
retry_delay = 5 # seconds
last_error = None
for attempt in range(max_retries):
try:
logger.debug(
f"[AIMusicGeneratorBlock] - Running model (attempt {attempt + 1})"
)
result = self.run_model(
api_key=credentials.api_key,
music_gen_model_version=input_data.music_gen_model_version,
prompt=input_data.prompt,
duration=input_data.duration,
temperature=input_data.temperature,
top_k=input_data.top_k,
top_p=input_data.top_p,
classifier_free_guidance=input_data.classifier_free_guidance,
output_format=input_data.output_format,
normalization_strategy=input_data.normalization_strategy,
)
if result and result != "No output received":
yield "result", result
return
else:
last_error = "Model returned empty or invalid response"
raise ValueError(last_error)
except Exception as e:
last_error = f"Unexpected error: {str(e)}"
logger.error(f"[AIMusicGeneratorBlock] - Error: {last_error}")
if attempt < max_retries - 1:
time.sleep(retry_delay)
continue
# If we've exhausted all retries, yield the error
yield "error", f"Failed after {max_retries} attempts. Last error: {last_error}"
def run_model(
self,
api_key: SecretStr,
music_gen_model_version: MusicGenModelVersion,
prompt: str,
duration: int,
temperature: float,
top_k: int,
top_p: float,
classifier_free_guidance: int,
output_format: AudioFormat,
normalization_strategy: NormalizationStrategy,
):
# Initialize Replicate client with the API key
client = replicate.Client(api_token=api_key.get_secret_value())
# Run the model with parameters
output = client.run(
"meta/musicgen:671ac645ce5e552cc63a54a2bbff63fcf798043055d2dac5fc9e36a837eedcfb",
input={
"prompt": prompt,
"music_gen_model_version": music_gen_model_version,
"duration": duration,
"temperature": temperature,
"top_k": top_k,
"top_p": top_p,
"classifier_free_guidance": classifier_free_guidance,
"output_format": output_format,
"normalization_strategy": normalization_strategy,
},
)
# Handle the output
if isinstance(output, list) and len(output) > 0:
result_url = output[0] # If output is a list, get the first element
elif isinstance(output, str):
result_url = output # If output is a string, use it directly
else:
result_url = (
"No output received" # Fallback message if output is not as expected
)
return result_url
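run() retries the Replicate call up to three times with a fixed five-second pause and only yields an error once every attempt has failed. The same pattern extracted into a generic helper; the names here are illustrative, not part of the block:

import time
from typing import Callable, TypeVar

T = TypeVar("T")

def run_with_retries(fn: Callable[[], T], max_retries: int = 3, retry_delay: int = 5) -> T:
    """Call fn(), retrying on any exception; re-raise once attempts are exhausted."""
    last_error: Exception | None = None
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as e:  # broad catch, mirroring the loop above
            last_error = e
            if attempt < max_retries - 1:
                time.sleep(retry_delay)
    raise RuntimeError(f"Failed after {max_retries} attempts") from last_error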

View File

@ -3,12 +3,17 @@ import time
from enum import Enum
from typing import Literal
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
from backend.util.request import requests
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
@ -136,13 +141,11 @@ logger = logging.getLogger(__name__)
class AIShortformVideoCreatorBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[Literal["revid"], Literal["api_key"]] = (
CredentialsField(
provider="revid",
supported_credential_types={"api_key"},
description="The revid.ai integration can be used with "
"any API key with sufficient permissions for the blocks it is used on.",
)
credentials: CredentialsMetaInput[
Literal[ProviderName.REVID], Literal["api_key"]
] = CredentialsField(
description="The revid.ai integration can be used with "
"any API key with sufficient permissions for the blocks it is used on.",
)
script: str = SchemaField(
description="""1. Use short and punctuated sentences\n\n2. Use linebreaks to create a new clip\n\n3. Text outside of brackets is spoken by the AI, and [text between brackets] will be used to guide the visual generation. For example, [close-up of a cat] will show a close-up of a cat.""",
@ -217,7 +220,6 @@ class AIShortformVideoCreatorBlock(Block):
url = "https://webhook.site/token"
headers = {"Accept": "application/json", "Content-Type": "application/json"}
response = requests.post(url, headers=headers)
response.raise_for_status()
webhook_data = response.json()
return webhook_data["uuid"], f"https://webhook.site/{webhook_data['uuid']}"
@ -228,14 +230,12 @@ class AIShortformVideoCreatorBlock(Block):
logger.debug(
f"API Response Status Code: {response.status_code}, Content: {response.text}"
)
response.raise_for_status()
return response.json()
def check_video_status(self, api_key: SecretStr, pid: str) -> dict:
url = f"https://www.revid.ai/api/public/v2/status?pid={pid}"
headers = {"key": api_key.get_secret_value()}
response = requests.get(url, headers=headers)
response.raise_for_status()
return response.json()
def wait_for_video(

View File

@ -1,13 +1,11 @@
import re
from typing import Any, List
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema, BlockType
from backend.data.model import SchemaField
from backend.util.mock import MockObject
from backend.util.text import TextFormatter
formatter = TextFormatter()
class StoreValueBlock(Block):
@ -148,9 +146,12 @@ class AgentInputBlock(Block):
description="The value to be passed as input.",
default=None,
)
title: str | None = SchemaField(
description="The title of the input.", default=None, advanced=True
)
description: str | None = SchemaField(
description="The description of the input.",
default=None,
advanced=True,
)
placeholder_values: List[Any] = SchemaField(
@ -163,6 +164,16 @@ class AgentInputBlock(Block):
default=False,
advanced=True,
)
advanced: bool = SchemaField(
description="Whether to show the input in the advanced section, if the field is not required.",
default=False,
advanced=True,
)
secret: bool = SchemaField(
description="Whether the input should be treated as a secret.",
default=False,
advanced=True,
)
class Output(BlockSchema):
result: Any = SchemaField(description="The value passed as input.")
@ -195,6 +206,7 @@ class AgentInputBlock(Block):
],
categories={BlockCategory.INPUT, BlockCategory.BASIC},
block_type=BlockType.INPUT,
static_output=True,
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
@ -205,35 +217,44 @@ class AgentOutputBlock(Block):
"""
Records the output of the graph for users to see.
Attributes:
recorded_value: The value to be recorded as output.
name: The name of the output.
description: The description of the output.
fmt_string: The format string to be used to format the recorded_value.
Outputs:
output: The formatted recorded_value if fmt_string is provided and the recorded_value
can be formatted, otherwise the raw recorded_value.
Behavior:
If `format` is provided and the `value` is of a type that can be formatted,
the block attempts to format the recorded_value using the `format`.
If formatting fails or no `format` is provided, the raw `value` is output.
"""
class Input(BlockSchema):
value: Any = SchemaField(description="The value to be recorded as output.")
value: Any = SchemaField(
description="The value to be recorded as output.",
default=None,
advanced=False,
)
name: str = SchemaField(description="The name of the output.")
title: str | None = SchemaField(
description="The title of the output.",
default=None,
advanced=True,
)
description: str | None = SchemaField(
description="The description of the output.",
default=None,
advanced=True,
)
format: str = SchemaField(
description="The format string to be used to format the recorded_value.",
description="The format string to be used to format the recorded_value. Use Jinja2 syntax.",
default="",
advanced=True,
)
advanced: bool = SchemaField(
description="Whether to treat the output as advanced.",
default=False,
advanced=True,
)
secret: bool = SchemaField(
description="Whether the output should be treated as a secret.",
default=False,
advanced=True,
)
class Output(BlockSchema):
output: Any = SchemaField(description="The value recorded as output.")
@ -241,7 +262,7 @@ class AgentOutputBlock(Block):
def __init__(self):
super().__init__(
id="363ae599-353e-4804-937e-b2ee3cef3da4",
description=("Stores the output of the graph for users to see."),
description="Stores the output of the graph for users to see.",
input_schema=AgentOutputBlock.Input,
output_schema=AgentOutputBlock.Output,
test_input=[
@ -271,6 +292,7 @@ class AgentOutputBlock(Block):
],
categories={BlockCategory.OUTPUT, BlockCategory.BASIC},
block_type=BlockType.OUTPUT,
static_output=True,
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
@ -280,9 +302,9 @@ class AgentOutputBlock(Block):
"""
if input_data.format:
try:
yield "output", formatter.format_string(
input_data.format, {input_data.name: input_data.value}
)
except Exception as e:
yield "output", f"Error: {e}, {input_data.value}"
else:
@ -291,16 +313,26 @@ class AgentOutputBlock(Block):
class AddToDictionaryBlock(Block):
class Input(BlockSchema):
dictionary: dict[Any, Any] = SchemaField(
default={},
description="The dictionary to add the entry to. If not provided, a new dictionary will be created.",
placeholder='{"key1": "value1", "key2": "value2"}',
)
key: str = SchemaField(
description="The key for the new entry.", placeholder="new_key"
default="",
description="The key for the new entry.",
placeholder="new_key",
advanced=False,
)
value: Any = SchemaField(
description="The value for the new entry.", placeholder="new_value"
default=None,
description="The value for the new entry.",
placeholder="new_value",
advanced=False,
)
entries: dict[Any, Any] = SchemaField(
default={},
description="The entries to add to the dictionary. This is the batch version of the `key` and `value` fields.",
advanced=True,
)
class Output(BlockSchema):
@ -323,6 +355,10 @@ class AddToDictionaryBlock(Block):
"value": "new_value",
},
{"key": "first_key", "value": "first_value"},
{
"dictionary": {"existing_key": "existing_value"},
"entries": {"new_key": "new_value", "first_key": "first_value"},
},
],
test_output=[
(
@ -330,38 +366,49 @@ class AddToDictionaryBlock(Block):
{"existing_key": "existing_value", "new_key": "new_value"},
),
("updated_dictionary", {"first_key": "first_value"}),
(
"updated_dictionary",
{
"existing_key": "existing_value",
"new_key": "new_value",
"first_key": "first_value",
},
),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
updated_dict = input_data.dictionary.copy()
if input_data.value is not None and input_data.key:
updated_dict[input_data.key] = input_data.value
for key, value in input_data.entries.items():
updated_dict[key] = value
yield "updated_dictionary", updated_dict
class AddToListBlock(Block):
class Input(BlockSchema):
list: List[Any] = SchemaField(
default=[],
advanced=False,
description="The list to add the entry to. If not provided, a new list will be created.",
placeholder='[1, "string", {"key": "value"}]',
)
entry: Any = SchemaField(
description="The entry to add to the list. Can be of any type (string, int, dict, etc.).",
placeholder='{"new_key": "new_value"}',
advanced=False,
default=None,
)
entries: List[Any] = SchemaField(
default=[],
description="The entries to add to the list. This is the batch version of the `entry` field.",
advanced=True,
)
position: int | None = SchemaField(
default=None,
description="The position to insert the new entry. If not provided, the entry will be appended to the end of the list.",
placeholder="0",
)
class Output(BlockSchema):
@ -385,6 +432,12 @@ class AddToListBlock(Block):
},
{"entry": "first_entry"},
{"list": ["a", "b", "c"], "entry": "d"},
{
"entry": "e",
"entries": ["f", "g"],
"list": ["a", "b"],
"position": 1,
},
],
test_output=[
(
@ -398,22 +451,20 @@ class AddToListBlock(Block):
),
("updated_list", ["first_entry"]),
("updated_list", ["a", "b", "c", "d"]),
("updated_list", ["a", "f", "g", "e", "b"]),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
entries_added = input_data.entries.copy()
if input_data.entry:
entries_added.append(input_data.entry)
updated_list = input_data.list.copy()
if (pos := input_data.position) is not None:
updated_list = updated_list[:pos] + entries_added + updated_list[pos:]
else:
updated_list += entries_added
yield "updated_list", updated_list
@ -441,3 +492,101 @@ class NoteBlock(Block):
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "output", input_data.text
class CreateDictionaryBlock(Block):
class Input(BlockSchema):
values: dict[str, Any] = SchemaField(
description="Key-value pairs to create the dictionary with",
placeholder="e.g., {'name': 'Alice', 'age': 25}",
)
class Output(BlockSchema):
dictionary: dict[str, Any] = SchemaField(
description="The created dictionary containing the specified key-value pairs"
)
error: str = SchemaField(
description="Error message if dictionary creation failed"
)
def __init__(self):
super().__init__(
id="b924ddf4-de4f-4b56-9a85-358930dcbc91",
description="Creates a dictionary with the specified key-value pairs. Use this when you know all the values you want to add upfront.",
categories={BlockCategory.DATA},
input_schema=CreateDictionaryBlock.Input,
output_schema=CreateDictionaryBlock.Output,
test_input=[
{
"values": {"name": "Alice", "age": 25, "city": "New York"},
},
{
"values": {"numbers": [1, 2, 3], "active": True, "score": 95.5},
},
],
test_output=[
(
"dictionary",
{"name": "Alice", "age": 25, "city": "New York"},
),
(
"dictionary",
{"numbers": [1, 2, 3], "active": True, "score": 95.5},
),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
try:
# The values are already validated by Pydantic schema
yield "dictionary", input_data.values
except Exception as e:
yield "error", f"Failed to create dictionary: {str(e)}"
class CreateListBlock(Block):
class Input(BlockSchema):
values: List[Any] = SchemaField(
description="A list of values to be combined into a new list.",
placeholder="e.g., ['Alice', 25, True]",
)
class Output(BlockSchema):
list: List[Any] = SchemaField(
description="The created list containing the specified values."
)
error: str = SchemaField(description="Error message if list creation failed.")
def __init__(self):
super().__init__(
id="a912d5c7-6e00-4542-b2a9-8034136930e4",
description="Creates a list with the specified values. Use this when you know all the values you want to add upfront.",
categories={BlockCategory.DATA},
input_schema=CreateListBlock.Input,
output_schema=CreateListBlock.Output,
test_input=[
{
"values": ["Alice", 25, True],
},
{
"values": [1, 2, 3, "four", {"key": "value"}],
},
],
test_output=[
(
"list",
["Alice", 25, True],
),
(
"list",
[1, 2, 3, "four", {"key": "value"}],
),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
try:
# The values are already validated by Pydantic schema
yield "list", input_data.values
except Exception as e:
yield "error", f"Failed to create list: {str(e)}"

View File

@ -71,11 +71,24 @@ class ConditionBlock(Block):
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
operator = input_data.operator
value1 = input_data.value1
if isinstance(value1, str):
try:
value1 = float(value1.strip())
except ValueError:
value1 = value1.strip()
value2 = input_data.value2
if isinstance(value2, str):
try:
value2 = float(value2.strip())
except ValueError:
value2 = value2.strip()
yes_value = input_data.yes_value if input_data.yes_value is not None else value1
no_value = input_data.no_value if input_data.no_value is not None else value2
comparison_funcs = {
ComparisonOperator.EQUAL: lambda a, b: a == b,
@ -86,17 +99,11 @@ class ConditionBlock(Block):
ComparisonOperator.LESS_THAN_OR_EQUAL: lambda a, b: a <= b,
}
result = comparison_funcs[operator](value1, value2)
yield "result", result
if result:
yield "yes_output", yes_value
else:
yield "no_output", no_value

View File

@ -0,0 +1,190 @@
from enum import Enum
from typing import Literal
from e2b_code_interpreter import Sandbox
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="e2b",
api_key=SecretStr("mock-e2b-api-key"),
title="Mock E2B API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.type,
}
class ProgrammingLanguage(Enum):
PYTHON = "python"
JAVASCRIPT = "js"
BASH = "bash"
R = "r"
JAVA = "java"
class CodeExecutionBlock(Block):
# TODO: Add support for uploading and downloading files
# Currently, CPU and memory can only be customized by creating a pre-customized sandbox template
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.E2B], Literal["api_key"]
] = CredentialsField(
description="Enter your api key for the E2B Sandbox. You can get it in here - https://e2b.dev/docs",
)
# TODO: Option to run commands in the background
setup_commands: list[str] = SchemaField(
description=(
"Shell commands to set up the sandbox before running the code. "
"You can use `curl` or `git` to install your desired Debian based "
"package manager. `pip` and `npm` are pre-installed.\n\n"
"These commands are executed with `sh`, in the foreground."
),
placeholder="pip install cowsay",
default=[],
advanced=False,
)
code: str = SchemaField(
description="Code to execute in the sandbox",
placeholder="print('Hello, World!')",
default="",
advanced=False,
)
language: ProgrammingLanguage = SchemaField(
description="Programming language to execute",
default=ProgrammingLanguage.PYTHON,
advanced=False,
)
timeout: int = SchemaField(
description="Execution timeout in seconds", default=300
)
template_id: str = SchemaField(
description=(
"You can use an E2B sandbox template by entering its ID here. "
"Check out the E2B docs for more details: "
"[E2B - Sandbox template](https://e2b.dev/docs/sandbox-template)"
),
default="",
advanced=True,
)
class Output(BlockSchema):
response: str = SchemaField(description="Response from code execution")
stdout_logs: str = SchemaField(
description="Standard output logs from execution"
)
stderr_logs: str = SchemaField(description="Standard error logs from execution")
error: str = SchemaField(description="Error message if execution failed")
def __init__(self):
super().__init__(
id="0b02b072-abe7-11ef-8372-fb5d162dd712",
description="Executes code in an isolated sandbox environment with internet access.",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=CodeExecutionBlock.Input,
output_schema=CodeExecutionBlock.Output,
test_credentials=TEST_CREDENTIALS,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"code": "print('Hello World')",
"language": ProgrammingLanguage.PYTHON.value,
"setup_commands": [],
"timeout": 300,
"template_id": "",
},
test_output=[
("response", "Hello World"),
("stdout_logs", "Hello World\n"),
],
test_mock={
"execute_code": lambda code, language, setup_commands, timeout, api_key, template_id: (
"Hello World",
"Hello World\n",
"",
),
},
)
def execute_code(
self,
code: str,
language: ProgrammingLanguage,
setup_commands: list[str],
timeout: int,
api_key: str,
template_id: str,
):
try:
sandbox = None
if template_id:
sandbox = Sandbox(
template=template_id, api_key=api_key, timeout=timeout
)
else:
sandbox = Sandbox(api_key=api_key, timeout=timeout)
if not sandbox:
raise Exception("Sandbox not created")
# Running setup commands
for cmd in setup_commands:
sandbox.commands.run(cmd)
# Executing the code
execution = sandbox.run_code(
code,
language=language.value,
on_error=lambda e: sandbox.kill(), # Kill the sandbox if there is an error
)
if execution.error:
raise Exception(execution.error)
response = execution.text
stdout_logs = "".join(execution.logs.stdout)
stderr_logs = "".join(execution.logs.stderr)
return response, stdout_logs, stderr_logs
except Exception as e:
raise e
def run(
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
try:
response, stdout_logs, stderr_logs = self.execute_code(
input_data.code,
input_data.language,
input_data.setup_commands,
input_data.timeout,
credentials.api_key.get_secret_value(),
input_data.template_id,
)
if response:
yield "response", response
if stdout_logs:
yield "stdout_logs", stdout_logs
if stderr_logs:
yield "stderr_logs", stderr_logs
except Exception as e:
yield "error", str(e)

View File

@ -0,0 +1,110 @@
import re
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class CodeExtractionBlock(Block):
class Input(BlockSchema):
text: str = SchemaField(
description="Text containing code blocks to extract (e.g., AI response)",
placeholder="Enter text containing code blocks",
)
class Output(BlockSchema):
html: str = SchemaField(description="Extracted HTML code")
css: str = SchemaField(description="Extracted CSS code")
javascript: str = SchemaField(description="Extracted JavaScript code")
python: str = SchemaField(description="Extracted Python code")
sql: str = SchemaField(description="Extracted SQL code")
java: str = SchemaField(description="Extracted Java code")
cpp: str = SchemaField(description="Extracted C++ code")
csharp: str = SchemaField(description="Extracted C# code")
json_code: str = SchemaField(description="Extracted JSON code")
bash: str = SchemaField(description="Extracted Bash code")
php: str = SchemaField(description="Extracted PHP code")
ruby: str = SchemaField(description="Extracted Ruby code")
yaml: str = SchemaField(description="Extracted YAML code")
markdown: str = SchemaField(description="Extracted Markdown code")
typescript: str = SchemaField(description="Extracted TypeScript code")
xml: str = SchemaField(description="Extracted XML code")
remaining_text: str = SchemaField(
description="Remaining text after code extraction"
)
def __init__(self):
super().__init__(
id="d3a7d896-3b78-4f44-8b4b-48fbf4f0bcd8",
description="Extracts code blocks from text and identifies their programming languages",
categories={BlockCategory.TEXT},
input_schema=CodeExtractionBlock.Input,
output_schema=CodeExtractionBlock.Output,
test_input={
"text": "Here's a Python example:\n```python\nprint('Hello World')\n```\nAnd some HTML:\n```html\n<h1>Title</h1>\n```"
},
test_output=[
("html", "<h1>Title</h1>"),
("python", "print('Hello World')"),
("remaining_text", "Here's a Python example:\nAnd some HTML:"),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
# List of supported programming languages with mapped aliases
language_aliases = {
"html": ["html", "htm"],
"css": ["css"],
"javascript": ["javascript", "js"],
"python": ["python", "py"],
"sql": ["sql"],
"java": ["java"],
"cpp": ["cpp", "c++"],
"csharp": ["csharp", "c#", "cs"],
"json_code": ["json"],
"bash": ["bash", "shell", "sh"],
"php": ["php"],
"ruby": ["ruby", "rb"],
"yaml": ["yaml", "yml"],
"markdown": ["markdown", "md"],
"typescript": ["typescript", "ts"],
"xml": ["xml"],
}
# Extract code for each language
for canonical_name, aliases in language_aliases.items():
code = ""
# Try each alias for the language
for alias in aliases:
code_for_alias = self.extract_code(input_data.text, alias)
if code_for_alias:
code = code + "\n\n" + code_for_alias if code else code_for_alias
if code: # Only yield if there's actual code content
yield canonical_name, code
# Remove all code blocks from the text to get remaining text
pattern = (
r"```(?:"
+ "|".join(
re.escape(alias)
for aliases in language_aliases.values()
for alias in aliases
)
+ r")\s+[\s\S]*?```"
)
remaining_text = re.sub(pattern, "", input_data.text).strip()
remaining_text = re.sub(r"\n\s*\n", "\n", remaining_text)
if remaining_text: # Only yield if there's remaining text
yield "remaining_text", remaining_text
def extract_code(self, text: str, language: str) -> str:
# Escape special regex characters in the language string
language = re.escape(language)
# Extract all code blocks enclosed in ```language``` blocks
pattern = re.compile(rf"```{language}\s+(.*?)```", re.DOTALL | re.IGNORECASE)
matches = pattern.finditer(text)
# Combine all code blocks for this language with newlines between them
code_blocks = [match.group(1).strip() for match in matches]
return "\n\n".join(code_blocks) if code_blocks else ""

View File

@ -0,0 +1,59 @@
from pydantic import BaseModel
from backend.data.block import (
Block,
BlockCategory,
BlockManualWebhookConfig,
BlockOutput,
BlockSchema,
)
from backend.data.model import SchemaField
from backend.integrations.webhooks.compass import CompassWebhookType
class Transcription(BaseModel):
text: str
speaker: str
end: float
start: float
duration: float
class TranscriptionDataModel(BaseModel):
date: str
transcription: str
transcriptions: list[Transcription]
class CompassAITriggerBlock(Block):
class Input(BlockSchema):
payload: TranscriptionDataModel = SchemaField(hidden=True)
class Output(BlockSchema):
transcription: str = SchemaField(
description="The contents of the compass transcription."
)
def __init__(self):
super().__init__(
id="9464a020-ed1d-49e1-990f-7f2ac924a2b7",
description="This block will output the contents of the compass transcription.",
categories={BlockCategory.HARDWARE},
input_schema=CompassAITriggerBlock.Input,
output_schema=CompassAITriggerBlock.Output,
webhook_config=BlockManualWebhookConfig(
provider="compass",
webhook_type=CompassWebhookType.TRANSCRIPTION,
),
test_input=[
{"input": "Hello, World!"},
{"input": "Hello, World!", "data": "Existing Data"},
],
# test_output=[
# ("output", "Hello, World!"), # No data provided, so trigger is returned
# ("output", "Existing Data"), # Data is provided, so data is returned.
# ],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "transcription", input_data.payload.transcription

View File

@ -0,0 +1,43 @@
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class WordCharacterCountBlock(Block):
class Input(BlockSchema):
text: str = SchemaField(
description="Input text to count words and characters",
placeholder="Enter your text here",
advanced=False,
)
class Output(BlockSchema):
word_count: int = SchemaField(description="Number of words in the input text")
character_count: int = SchemaField(
description="Number of characters in the input text"
)
error: str = SchemaField(
description="Error message if the counting operation failed"
)
def __init__(self):
super().__init__(
id="ab2a782d-22cf-4587-8a70-55b59b3f9f90",
description="Counts the number of words and characters in a given text.",
categories={BlockCategory.TEXT},
input_schema=WordCharacterCountBlock.Input,
output_schema=WordCharacterCountBlock.Output,
test_input={"text": "Hello, how are you?"},
test_output=[("word_count", 4), ("character_count", 19)],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
try:
text = input_data.text
word_count = len(text.split())
character_count = len(text)
yield "word_count", word_count
yield "character_count", character_count
except Exception as e:
yield "error", str(e)

View File

@ -3,21 +3,24 @@ from typing import Literal
import aiohttp
import discord
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
DiscordCredentials = CredentialsMetaInput[
Literal[ProviderName.DISCORD], Literal["api_key"]
]
def DiscordCredentialsField() -> DiscordCredentials:
return CredentialsField(description="Discord bot token")
TEST_CREDENTIALS = APIKeyCredentials(

View File

@ -0,0 +1,32 @@
from typing import Literal
from pydantic import SecretStr
from backend.data.model import APIKeyCredentials, CredentialsField, CredentialsMetaInput
from backend.integrations.providers import ProviderName
ExaCredentials = APIKeyCredentials
ExaCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.EXA],
Literal["api_key"],
]
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="exa",
api_key=SecretStr("mock-exa-api-key"),
title="Mock Exa API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
def ExaCredentialsField() -> ExaCredentialsInput:
"""Creates an Exa credentials input on a block."""
return CredentialsField(description="The Exa integration requires an API Key.")

View File

@ -0,0 +1,87 @@
from typing import List, Optional
from pydantic import BaseModel
from backend.blocks.exa._auth import (
ExaCredentials,
ExaCredentialsField,
ExaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class ContentRetrievalSettings(BaseModel):
text: dict = SchemaField(
description="Text content settings",
default={"maxCharacters": 1000, "includeHtmlTags": False},
advanced=True,
)
highlights: dict = SchemaField(
description="Highlight settings",
default={
"numSentences": 3,
"highlightsPerUrl": 3,
"query": "",
},
advanced=True,
)
summary: dict = SchemaField(
description="Summary settings",
default={"query": ""},
advanced=True,
)
class ExaContentsBlock(Block):
class Input(BlockSchema):
credentials: ExaCredentialsInput = ExaCredentialsField()
ids: List[str] = SchemaField(
description="Array of document IDs obtained from searches",
)
contents: ContentRetrievalSettings = SchemaField(
description="Content retrieval settings",
default=ContentRetrievalSettings(),
advanced=True,
)
class Output(BlockSchema):
results: list = SchemaField(
description="List of document contents",
default=[],
)
def __init__(self):
super().__init__(
id="c52be83f-f8cd-4180-b243-af35f986b461",
description="Retrieves document contents using Exa's contents API",
categories={BlockCategory.SEARCH},
input_schema=ExaContentsBlock.Input,
output_schema=ExaContentsBlock.Output,
)
def run(
self, input_data: Input, *, credentials: ExaCredentials, **kwargs
) -> BlockOutput:
url = "https://api.exa.ai/contents"
headers = {
"Content-Type": "application/json",
"x-api-key": credentials.api_key.get_secret_value(),
}
payload = {
"ids": input_data.ids,
"text": input_data.contents.text,
"highlights": input_data.contents.highlights,
"summary": input_data.contents.summary,
}
try:
response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
data = response.json()
yield "results", data.get("results", [])
except Exception as e:
yield "error", str(e)
yield "results", []

View File

@ -0,0 +1,54 @@
from typing import Optional
from pydantic import BaseModel
from backend.data.model import SchemaField
class TextSettings(BaseModel):
max_characters: int = SchemaField(
default=1000,
description="Maximum number of characters to return",
placeholder="1000",
)
include_html_tags: bool = SchemaField(
default=False,
description="Whether to include HTML tags in the text",
placeholder="False",
)
class HighlightSettings(BaseModel):
num_sentences: int = SchemaField(
default=3,
description="Number of sentences per highlight",
placeholder="3",
)
highlights_per_url: int = SchemaField(
default=3,
description="Number of highlights per URL",
placeholder="3",
)
class SummarySettings(BaseModel):
query: Optional[str] = SchemaField(
default="",
description="Query string for summarization",
placeholder="Enter query",
)
class ContentSettings(BaseModel):
text: TextSettings = SchemaField(
default=TextSettings(),
description="Text content settings",
)
highlights: HighlightSettings = SchemaField(
default=HighlightSettings(),
description="Highlight settings",
)
summary: SummarySettings = SchemaField(
default=SummarySettings(),
description="Summary settings",
)
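These helpers are nested pydantic models whose instance is serialized with .dict() and attached to the Exa request payloads. A rough equivalent using plain pydantic Field defaults, to illustrate the shape that serialization produces (SchemaField layers UI metadata on top of this):

from typing import Optional
from pydantic import BaseModel, Field

class TextSettings(BaseModel):
    max_characters: int = Field(default=1000)
    include_html_tags: bool = Field(default=False)

class HighlightSettings(BaseModel):
    num_sentences: int = Field(default=3)
    highlights_per_url: int = Field(default=3)

class SummarySettings(BaseModel):
    query: Optional[str] = Field(default="")

class ContentSettings(BaseModel):
    text: TextSettings = Field(default_factory=TextSettings)
    highlights: HighlightSettings = Field(default_factory=HighlightSettings)
    summary: SummarySettings = Field(default_factory=SummarySettings)

print(ContentSettings().dict())
# {'text': {'max_characters': 1000, 'include_html_tags': False}, 'highlights': {...}, 'summary': {'query': ''}}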

View File

@ -0,0 +1,143 @@
from datetime import datetime
from typing import List
from backend.blocks.exa._auth import (
ExaCredentials,
ExaCredentialsField,
ExaCredentialsInput,
)
from backend.blocks.exa.helpers import ContentSettings
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class ExaSearchBlock(Block):
class Input(BlockSchema):
credentials: ExaCredentialsInput = ExaCredentialsField()
query: str = SchemaField(description="The search query")
use_auto_prompt: bool = SchemaField(
description="Whether to use autoprompt",
default=True,
advanced=True,
)
type: str = SchemaField(
description="Type of search",
default="",
advanced=True,
)
category: str = SchemaField(
description="Category to search within",
default="",
advanced=True,
)
number_of_results: int = SchemaField(
description="Number of results to return",
default=10,
advanced=True,
)
include_domains: List[str] = SchemaField(
description="Domains to include in search",
default=[],
)
exclude_domains: List[str] = SchemaField(
description="Domains to exclude from search",
default=[],
advanced=True,
)
start_crawl_date: datetime = SchemaField(
description="Start date for crawled content",
)
end_crawl_date: datetime = SchemaField(
description="End date for crawled content",
)
start_published_date: datetime = SchemaField(
description="Start date for published content",
)
end_published_date: datetime = SchemaField(
description="End date for published content",
)
include_text: List[str] = SchemaField(
description="Text patterns to include",
default=[],
advanced=True,
)
exclude_text: List[str] = SchemaField(
description="Text patterns to exclude",
default=[],
advanced=True,
)
contents: ContentSettings = SchemaField(
description="Content retrieval settings",
default=ContentSettings(),
advanced=True,
)
class Output(BlockSchema):
results: list = SchemaField(
description="List of search results",
default=[],
)
def __init__(self):
super().__init__(
id="996cec64-ac40-4dde-982f-b0dc60a5824d",
description="Searches the web using Exa's advanced search API",
categories={BlockCategory.SEARCH},
input_schema=ExaSearchBlock.Input,
output_schema=ExaSearchBlock.Output,
)
def run(
self, input_data: Input, *, credentials: ExaCredentials, **kwargs
) -> BlockOutput:
url = "https://api.exa.ai/search"
headers = {
"Content-Type": "application/json",
"x-api-key": credentials.api_key.get_secret_value(),
}
payload = {
"query": input_data.query,
"useAutoprompt": input_data.use_auto_prompt,
"numResults": input_data.number_of_results,
"contents": input_data.contents.dict(),
}
date_field_mapping = {
"start_crawl_date": "startCrawlDate",
"end_crawl_date": "endCrawlDate",
"start_published_date": "startPublishedDate",
"end_published_date": "endPublishedDate",
}
# Add dates if they exist
for input_field, api_field in date_field_mapping.items():
value = getattr(input_data, input_field, None)
if value:
payload[api_field] = value.strftime("%Y-%m-%dT%H:%M:%S.000Z")
optional_field_mapping = {
"type": "type",
"category": "category",
"include_domains": "includeDomains",
"exclude_domains": "excludeDomains",
"include_text": "includeText",
"exclude_text": "excludeText",
}
# Add other fields
for input_field, api_field in optional_field_mapping.items():
value = getattr(input_data, input_field)
if value: # Only add non-empty values
payload[api_field] = value
try:
response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
data = response.json()
# Extract just the results array from the response
yield "results", data.get("results", [])
except Exception as e:
yield "error", str(e)
yield "results", []

View File

@ -0,0 +1,128 @@
from datetime import datetime
from typing import Any, List
from backend.blocks.exa._auth import (
ExaCredentials,
ExaCredentialsField,
ExaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
from .helpers import ContentSettings
class ExaFindSimilarBlock(Block):
class Input(BlockSchema):
credentials: ExaCredentialsInput = ExaCredentialsField()
url: str = SchemaField(
description="The url for which you would like to find similar links"
)
number_of_results: int = SchemaField(
description="Number of results to return",
default=10,
advanced=True,
)
include_domains: List[str] = SchemaField(
description="Domains to include in search",
default=[],
advanced=True,
)
exclude_domains: List[str] = SchemaField(
description="Domains to exclude from search",
default=[],
advanced=True,
)
start_crawl_date: datetime = SchemaField(
description="Start date for crawled content",
)
end_crawl_date: datetime = SchemaField(
description="End date for crawled content",
)
start_published_date: datetime = SchemaField(
description="Start date for published content",
)
end_published_date: datetime = SchemaField(
description="End date for published content",
)
include_text: List[str] = SchemaField(
description="Text patterns to include (max 1 string, up to 5 words)",
default=[],
advanced=True,
)
exclude_text: List[str] = SchemaField(
description="Text patterns to exclude (max 1 string, up to 5 words)",
default=[],
advanced=True,
)
contents: ContentSettings = SchemaField(
description="Content retrieval settings",
default=ContentSettings(),
advanced=True,
)
class Output(BlockSchema):
results: List[Any] = SchemaField(
description="List of similar documents with title, URL, published date, author, and score",
default=[],
)
def __init__(self):
super().__init__(
id="5e7315d1-af61-4a0c-9350-7c868fa7438a",
description="Finds similar links using Exa's findSimilar API",
categories={BlockCategory.SEARCH},
input_schema=ExaFindSimilarBlock.Input,
output_schema=ExaFindSimilarBlock.Output,
)
def run(
self, input_data: Input, *, credentials: ExaCredentials, **kwargs
) -> BlockOutput:
url = "https://api.exa.ai/findSimilar"
headers = {
"Content-Type": "application/json",
"x-api-key": credentials.api_key.get_secret_value(),
}
payload = {
"url": input_data.url,
"numResults": input_data.number_of_results,
"contents": input_data.contents.dict(),
}
optional_field_mapping = {
"include_domains": "includeDomains",
"exclude_domains": "excludeDomains",
"include_text": "includeText",
"exclude_text": "excludeText",
}
# Add optional fields if they have values
for input_field, api_field in optional_field_mapping.items():
value = getattr(input_data, input_field)
if value: # Only add non-empty values
payload[api_field] = value
date_field_mapping = {
"start_crawl_date": "startCrawlDate",
"end_crawl_date": "endCrawlDate",
"start_published_date": "startPublishedDate",
"end_published_date": "endPublishedDate",
}
# Add dates if they exist
for input_field, api_field in date_field_mapping.items():
value = getattr(input_data, input_field, None)
if value:
payload[api_field] = value.strftime("%Y-%m-%dT%H:%M:%S.000Z")
try:
response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
data = response.json()
yield "results", data.get("results", [])
except Exception as e:
yield "error", str(e)
yield "results", []

View File

@ -0,0 +1,35 @@
from typing import Literal
from pydantic import SecretStr
from backend.data.model import APIKeyCredentials, CredentialsField, CredentialsMetaInput
from backend.integrations.providers import ProviderName
FalCredentials = APIKeyCredentials
FalCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.FAL],
Literal["api_key"],
]
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="fal",
api_key=SecretStr("mock-fal-api-key"),
title="Mock FAL API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
def FalCredentialsField() -> FalCredentialsInput:
"""
Creates a FAL credentials input on a block.
"""
return CredentialsField(
description="The FAL integration can be used with an API Key.",
)

View File

@ -0,0 +1,199 @@
import logging
import time
from enum import Enum
from typing import Any
import httpx
from backend.blocks.fal._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
FalCredentials,
FalCredentialsField,
FalCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
logger = logging.getLogger(__name__)
class FalModel(str, Enum):
MOCHI = "fal-ai/mochi-v1"
LUMA = "fal-ai/luma-dream-machine"
class AIVideoGeneratorBlock(Block):
class Input(BlockSchema):
prompt: str = SchemaField(
description="Description of the video to generate.",
placeholder="A dog running in a field.",
)
model: FalModel = SchemaField(
title="FAL Model",
default=FalModel.MOCHI,
description="The FAL model to use for video generation.",
)
credentials: FalCredentialsInput = FalCredentialsField()
class Output(BlockSchema):
video_url: str = SchemaField(description="The URL of the generated video.")
error: str = SchemaField(
description="Error message if video generation failed."
)
logs: list[str] = SchemaField(
description="Generation progress logs.", optional=True
)
def __init__(self):
super().__init__(
id="530cf046-2ce0-4854-ae2c-659db17c7a46",
description="Generate videos using FAL AI models.",
categories={BlockCategory.AI},
input_schema=self.Input,
output_schema=self.Output,
test_input={
"prompt": "A dog running in a field.",
"model": FalModel.MOCHI,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("video_url", "https://fal.media/files/example/video.mp4")],
test_mock={
"generate_video": lambda *args, **kwargs: "https://fal.media/files/example/video.mp4"
},
)
def _get_headers(self, api_key: str) -> dict[str, str]:
"""Get headers for FAL API requests."""
return {
"Authorization": f"Key {api_key}",
"Content-Type": "application/json",
}
def _submit_request(
self, url: str, headers: dict[str, str], data: dict[str, Any]
) -> dict[str, Any]:
"""Submit a request to the FAL API."""
try:
response = httpx.post(url, headers=headers, json=data)
response.raise_for_status()
return response.json()
except httpx.HTTPError as e:
logger.error(f"FAL API request failed: {str(e)}")
raise RuntimeError(f"Failed to submit request: {str(e)}")
def _poll_status(self, status_url: str, headers: dict[str, str]) -> dict[str, Any]:
"""Poll the status endpoint until completion or failure."""
try:
response = httpx.get(status_url, headers=headers)
response.raise_for_status()
return response.json()
except httpx.HTTPError as e:
logger.error(f"Failed to get status: {str(e)}")
raise RuntimeError(f"Failed to get status: {str(e)}")
def generate_video(self, input_data: Input, credentials: FalCredentials) -> str:
"""Generate video using the specified FAL model."""
base_url = "https://queue.fal.run"
api_key = credentials.api_key.get_secret_value()
headers = self._get_headers(api_key)
# Submit generation request
submit_url = f"{base_url}/{input_data.model.value}"
submit_data = {"prompt": input_data.prompt}
seen_logs = set()
try:
# Submit request to queue
submit_response = httpx.post(submit_url, headers=headers, json=submit_data)
submit_response.raise_for_status()
request_data = submit_response.json()
# Get request_id and urls from initial response
request_id = request_data.get("request_id")
status_url = request_data.get("status_url")
result_url = request_data.get("response_url")
if not all([request_id, status_url, result_url]):
raise ValueError("Missing required data in submission response")
# Poll for status with exponential backoff
max_attempts = 30
attempt = 0
base_wait_time = 5
while attempt < max_attempts:
status_response = httpx.get(f"{status_url}?logs=1", headers=headers)
status_response.raise_for_status()
status_data = status_response.json()
# Process new logs only
logs = status_data.get("logs", [])
if logs and isinstance(logs, list):
for log in logs:
if isinstance(log, dict):
# Create a unique key for this log entry
log_key = (
f"{log.get('timestamp', '')}-{log.get('message', '')}"
)
if log_key not in seen_logs:
seen_logs.add(log_key)
message = log.get("message", "")
if message:
logger.debug(
f"[FAL Generation] [{log.get('level', 'INFO')}] [{log.get('source', '')}] [{log.get('timestamp', '')}] {message}"
)
status = status_data.get("status")
if status == "COMPLETED":
# Get the final result
result_response = httpx.get(result_url, headers=headers)
result_response.raise_for_status()
result_data = result_response.json()
if "video" not in result_data or not isinstance(
result_data["video"], dict
):
raise ValueError("Invalid response format - missing video data")
video_url = result_data["video"].get("url")
if not video_url:
raise ValueError("No video URL in response")
return video_url
elif status == "FAILED":
error_msg = status_data.get("error", "No error details provided")
raise RuntimeError(f"Video generation failed: {error_msg}")
elif status == "IN_QUEUE":
position = status_data.get("queue_position", "unknown")
logger.debug(
f"[FAL Generation] Status: In queue, position: {position}"
)
elif status == "IN_PROGRESS":
logger.debug(
"[FAL Generation] Status: Request is being processed..."
)
else:
logger.info(f"[FAL Generation] Status: Unknown status: {status}")
wait_time = min(base_wait_time * (2**attempt), 60) # Cap at 60 seconds
time.sleep(wait_time)
attempt += 1
raise RuntimeError("Maximum polling attempts reached")
except httpx.HTTPError as e:
raise RuntimeError(f"API request failed: {str(e)}")
def run(
self, input_data: Input, *, credentials: FalCredentials, **kwargs
) -> BlockOutput:
try:
video_url = self.generate_video(input_data, credentials)
yield "video_url", video_url
except Exception as e:
error_message = str(e)
yield "error", error_message

View File

@ -0,0 +1,43 @@
from urllib.parse import urlparse
from backend.blocks.github._auth import GithubCredentials
from backend.util.request import Requests
def _convert_to_api_url(url: str) -> str:
"""
Converts a standard GitHub URL to the corresponding GitHub API URL.
Handles repository URLs, issue URLs, pull request URLs, and more.
"""
parsed_url = urlparse(url)
path_parts = parsed_url.path.strip("/").split("/")
if len(path_parts) >= 2:
owner, repo = path_parts[0], path_parts[1]
api_base = f"https://api.github.com/repos/{owner}/{repo}"
if len(path_parts) > 2:
additional_path = "/".join(path_parts[2:])
api_url = f"{api_base}/{additional_path}"
else:
# Repository base URL
api_url = api_base
else:
raise ValueError("Invalid GitHub URL format.")
return api_url
def _get_headers(credentials: GithubCredentials) -> dict[str, str]:
return {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
def get_api(credentials: GithubCredentials, convert_urls: bool = True) -> Requests:
return Requests(
trusted_origins=["https://api.github.com", "https://github.com"],
extra_url_validator=_convert_to_api_url if convert_urls else None,
extra_headers=_get_headers(credentials),
)
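Illustrative usage of the helpers above; octocat/Hello-World is a placeholder repository, and the commented lines only sketch how a block would use the returned client:
# Sketch only: the URL conversion that get_api wires in via extra_url_validator.
assert (
    _convert_to_api_url("https://github.com/octocat/Hello-World/issues/1")
    == "https://api.github.com/repos/octocat/Hello-World/issues/1"
)
# With convert_urls=True (the default), blocks can pass plain github.com URLs:
#   api = get_api(credentials)
#   response = api.get("https://github.com/octocat/Hello-World/issues/1")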

View File

@ -1,12 +1,14 @@
from typing import Literal
from autogpt_libs.supabase_integration_credentials_store.types import (
APIKeyCredentials,
OAuth2Credentials,
)
from pydantic import SecretStr
from backend.data.model import CredentialsField, CredentialsMetaInput
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
OAuth2Credentials,
)
from backend.integrations.providers import ProviderName
from backend.util.settings import Secrets
secrets = Secrets()
@ -16,7 +18,7 @@ GITHUB_OAUTH_IS_CONFIGURED = bool(
GithubCredentials = APIKeyCredentials | OAuth2Credentials
GithubCredentialsInput = CredentialsMetaInput[
Literal["github"],
Literal[ProviderName.GITHUB],
Literal["api_key", "oauth2"] if GITHUB_OAUTH_IS_CONFIGURED else Literal["api_key"],
]
@ -29,10 +31,6 @@ def GithubCredentialsField(scope: str) -> GithubCredentialsInput:
scope: The authorization scope needed for the block to work. ([list of available scopes](https://docs.github.com/en/apps/oauth-apps/building-oauth-apps/scopes-for-oauth-apps#available-scopes))
""" # noqa
return CredentialsField(
provider="github",
supported_credential_types=(
{"api_key", "oauth2"} if GITHUB_OAUTH_IS_CONFIGURED else {"api_key"}
),
required_scopes={scope},
description="The GitHub integration can be used with OAuth, "
"or any API key with sufficient permissions for the blocks it is used on.",

View File

@ -0,0 +1,700 @@
{
"action": "synchronize",
"number": 8358,
"pull_request": {
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/8358",
"id": 2128918491,
"node_id": "PR_kwDOJKSTjM5-5Lfb",
"html_url": "https://github.com/Significant-Gravitas/AutoGPT/pull/8358",
"diff_url": "https://github.com/Significant-Gravitas/AutoGPT/pull/8358.diff",
"patch_url": "https://github.com/Significant-Gravitas/AutoGPT/pull/8358.patch",
"issue_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/8358",
"number": 8358,
"state": "open",
"locked": false,
"title": "feat(platform, blocks): Webhook-triggered blocks",
"user": {
"login": "Pwuts",
"id": 12185583,
"node_id": "MDQ6VXNlcjEyMTg1NTgz",
"avatar_url": "https://avatars.githubusercontent.com/u/12185583?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Pwuts",
"html_url": "https://github.com/Pwuts",
"followers_url": "https://api.github.com/users/Pwuts/followers",
"following_url": "https://api.github.com/users/Pwuts/following{/other_user}",
"gists_url": "https://api.github.com/users/Pwuts/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Pwuts/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Pwuts/subscriptions",
"organizations_url": "https://api.github.com/users/Pwuts/orgs",
"repos_url": "https://api.github.com/users/Pwuts/repos",
"events_url": "https://api.github.com/users/Pwuts/events{/privacy}",
"received_events_url": "https://api.github.com/users/Pwuts/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
},
"body": "- Resolves #8352\r\n\r\n## Changes 🏗️\r\n\r\n- feat(blocks): Add GitHub Pull Request Trigger block\r\n\r\n### feat(platform): Add support for Webhook-triggered blocks\r\n- ⚠️ Add `PLATFORM_BASE_URL` setting\r\n\r\n- Add webhook config option and `BlockType.WEBHOOK` to `Block`\r\n - Add check to `Block.__init__` to enforce type and shape of webhook event filter\r\n - Add check to `Block.__init__` to enforce `payload` input on webhook blocks\r\n\r\n- Add `Webhook` model + CRUD functions in `backend.data.integrations` to represent webhooks created by our system\r\n - Add `IntegrationWebhook` to DB schema + reference `AgentGraphNode.webhook_id`\r\n - Add `set_node_webhook(..)` in `backend.data.graph`\r\n\r\n- Add webhook-related endpoints:\r\n - `POST /integrations/{provider}/webhooks/{webhook_id}/ingress` endpoint, to receive webhook payloads, and for all associated nodes create graph executions\r\n - Add `Node.is_triggered_by_event_type(..)` helper method\r\n - `POST /integrations/{provider}/webhooks/{webhook_id}/ping` endpoint, to allow testing a webhook\r\n - Add `WebhookEvent` + pub/sub functions in `backend.data.integrations`\r\n\r\n- Add `backend.integrations.webhooks` module, including:\r\n - `graph_lifecycle_hooks`, e.g. `on_graph_activate(..)`, to handle corresponding webhook creation etc.\r\n - Add calls to these hooks in the graph create/update endpoints\r\n - `BaseWebhooksManager` + `GithubWebhooksManager` to handle creating + registering, removing + deregistering, and retrieving existing webhooks, and validating incoming payloads\r\n\r\n### Other improvements\r\n- fix(blocks): Allow having an input and output pin with the same name\r\n- feat(blocks): Allow hiding inputs (e.g. `payload`) with `SchemaField(hidden=True)`\r\n- feat(backend/data): Add `graph_id`, `graph_version` to `Node`; `user_id` to `GraphMeta`\r\n - Add `Creatable` versions of `Node`, `GraphMeta` and `Graph` without these properties\r\n - Add `graph_from_creatable(..)` helper function in `backend.data.graph`\r\n- refactor(backend/data): Make `RedisEventQueue` generic\r\n- refactor(frontend): Deduplicate & clean up code for different block types in `generateInputHandles(..)` in `CustomNode`\r\n- refactor(backend): Remove unused subgraph functionality\r\n\r\n## How it works\r\n- When a graph is created, the `on_graph_activate` and `on_node_activate` hooks are called on the graph and its nodes\r\n- If a webhook-triggered node has presets for all the relevant inputs, `on_node_activate` will get/create a suitable webhook and link it by setting `AgentGraphNode.webhook_id`\r\n - `on_node_activate` uses `webhook_manager.get_suitable_webhook(..)`, which tries to find a suitable webhook (with matching requirements) or creates it if none exists yet\r\n- When a graph is deactivated (in favor of a newer/other version) or deleted, `on_graph_deactivate` and `on_node_deactivate` are called on the graph and its nodes to clean up webhooks that are no longer in use\r\n- When a valid webhook payload is received, two things happen:\r\n 1. It is broadcast on the Redis channel `webhooks/{webhook_id}/{event_type}`\r\n 2. 
Graph executions are initiated for all nodes triggered by this webhook\r\n\r\n## TODO\r\n- [ ] #8537\r\n- [x] #8538\r\n- [ ] #8357\r\n- [ ] ~~#8554~~ can be done in a follow-up PR\r\n- [ ] Test test test!\r\n- [ ] Add note on `repo` input of webhook blocks that the credentials used must have the right permissions for the given organization/repo\r\n- [x] Implement proper detection and graceful handling of webhook creation failing due to insufficient permissions. This should give a clear message to the user to e.g. \"give the app access to this organization in your settings\".\r\n- [ ] Nice-to-have: make a button on webhook blocks to trigger a ping and check its result. The API endpoints for this is already implemented.",
"created_at": "2024-10-16T22:13:47Z",
"updated_at": "2024-11-11T18:34:54Z",
"closed_at": null,
"merged_at": null,
"merge_commit_sha": "cbfd0cdd8db52cdd5a3b7ce088fc0ab4617a652e",
"assignee": {
"login": "Pwuts",
"id": 12185583,
"node_id": "MDQ6VXNlcjEyMTg1NTgz",
"avatar_url": "https://avatars.githubusercontent.com/u/12185583?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Pwuts",
"html_url": "https://github.com/Pwuts",
"followers_url": "https://api.github.com/users/Pwuts/followers",
"following_url": "https://api.github.com/users/Pwuts/following{/other_user}",
"gists_url": "https://api.github.com/users/Pwuts/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Pwuts/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Pwuts/subscriptions",
"organizations_url": "https://api.github.com/users/Pwuts/orgs",
"repos_url": "https://api.github.com/users/Pwuts/repos",
"events_url": "https://api.github.com/users/Pwuts/events{/privacy}",
"received_events_url": "https://api.github.com/users/Pwuts/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
},
"assignees": [
{
"login": "Pwuts",
"id": 12185583,
"node_id": "MDQ6VXNlcjEyMTg1NTgz",
"avatar_url": "https://avatars.githubusercontent.com/u/12185583?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Pwuts",
"html_url": "https://github.com/Pwuts",
"followers_url": "https://api.github.com/users/Pwuts/followers",
"following_url": "https://api.github.com/users/Pwuts/following{/other_user}",
"gists_url": "https://api.github.com/users/Pwuts/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Pwuts/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Pwuts/subscriptions",
"organizations_url": "https://api.github.com/users/Pwuts/orgs",
"repos_url": "https://api.github.com/users/Pwuts/repos",
"events_url": "https://api.github.com/users/Pwuts/events{/privacy}",
"received_events_url": "https://api.github.com/users/Pwuts/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
],
"requested_reviewers": [
{
"login": "kcze",
"id": 34861343,
"node_id": "MDQ6VXNlcjM0ODYxMzQz",
"avatar_url": "https://avatars.githubusercontent.com/u/34861343?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kcze",
"html_url": "https://github.com/kcze",
"followers_url": "https://api.github.com/users/kcze/followers",
"following_url": "https://api.github.com/users/kcze/following{/other_user}",
"gists_url": "https://api.github.com/users/kcze/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kcze/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kcze/subscriptions",
"organizations_url": "https://api.github.com/users/kcze/orgs",
"repos_url": "https://api.github.com/users/kcze/repos",
"events_url": "https://api.github.com/users/kcze/events{/privacy}",
"received_events_url": "https://api.github.com/users/kcze/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
],
"requested_teams": [
{
"name": "DevOps",
"id": 9547361,
"node_id": "T_kwDOB8roIc4Aka5h",
"slug": "devops",
"description": "",
"privacy": "closed",
"notification_setting": "notifications_enabled",
"url": "https://api.github.com/organizations/130738209/team/9547361",
"html_url": "https://github.com/orgs/Significant-Gravitas/teams/devops",
"members_url": "https://api.github.com/organizations/130738209/team/9547361/members{/member}",
"repositories_url": "https://api.github.com/organizations/130738209/team/9547361/repos",
"permission": "pull",
"parent": null
}
],
"labels": [
{
"id": 5272676214,
"node_id": "LA_kwDOJKSTjM8AAAABOkandg",
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
},
{
"id": 5410633769,
"node_id": "LA_kwDOJKSTjM8AAAABQn-4KQ",
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels/size/xl",
"name": "size/xl",
"color": "E751DD",
"default": false,
"description": ""
},
{
"id": 6892322271,
"node_id": "LA_kwDOJKSTjM8AAAABmtB93w",
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels/Review%20effort%20[1-5]:%204",
"name": "Review effort [1-5]: 4",
"color": "d1bcf9",
"default": false,
"description": null
},
{
"id": 7218433025,
"node_id": "LA_kwDOJKSTjM8AAAABrkCMAQ",
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels/platform/frontend",
"name": "platform/frontend",
"color": "033C07",
"default": false,
"description": "AutoGPT Platform - Front end"
},
{
"id": 7219356193,
"node_id": "LA_kwDOJKSTjM8AAAABrk6iIQ",
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels/platform/backend",
"name": "platform/backend",
"color": "ededed",
"default": false,
"description": "AutoGPT Platform - Back end"
},
{
"id": 7515330106,
"node_id": "LA_kwDOJKSTjM8AAAABv_LWOg",
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels/platform/blocks",
"name": "platform/blocks",
"color": "eb5757",
"default": false,
"description": null
}
],
"milestone": null,
"draft": false,
"commits_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/8358/commits",
"review_comments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/8358/comments",
"review_comment_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/comments{/number}",
"comments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/8358/comments",
"statuses_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/statuses/8f708a2b60463eec10747d8f45dead35b5a45bd0",
"head": {
"label": "Significant-Gravitas:reinier/open-1961-implement-github-on-pull-request-block",
"ref": "reinier/open-1961-implement-github-on-pull-request-block",
"sha": "8f708a2b60463eec10747d8f45dead35b5a45bd0",
"user": {
"login": "Significant-Gravitas",
"id": 130738209,
"node_id": "O_kgDOB8roIQ",
"avatar_url": "https://avatars.githubusercontent.com/u/130738209?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Significant-Gravitas",
"html_url": "https://github.com/Significant-Gravitas",
"followers_url": "https://api.github.com/users/Significant-Gravitas/followers",
"following_url": "https://api.github.com/users/Significant-Gravitas/following{/other_user}",
"gists_url": "https://api.github.com/users/Significant-Gravitas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Significant-Gravitas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Significant-Gravitas/subscriptions",
"organizations_url": "https://api.github.com/users/Significant-Gravitas/orgs",
"repos_url": "https://api.github.com/users/Significant-Gravitas/repos",
"events_url": "https://api.github.com/users/Significant-Gravitas/events{/privacy}",
"received_events_url": "https://api.github.com/users/Significant-Gravitas/received_events",
"type": "Organization",
"user_view_type": "public",
"site_admin": false
},
"repo": {
"id": 614765452,
"node_id": "R_kgDOJKSTjA",
"name": "AutoGPT",
"full_name": "Significant-Gravitas/AutoGPT",
"private": false,
"owner": {
"login": "Significant-Gravitas",
"id": 130738209,
"node_id": "O_kgDOB8roIQ",
"avatar_url": "https://avatars.githubusercontent.com/u/130738209?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Significant-Gravitas",
"html_url": "https://github.com/Significant-Gravitas",
"followers_url": "https://api.github.com/users/Significant-Gravitas/followers",
"following_url": "https://api.github.com/users/Significant-Gravitas/following{/other_user}",
"gists_url": "https://api.github.com/users/Significant-Gravitas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Significant-Gravitas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Significant-Gravitas/subscriptions",
"organizations_url": "https://api.github.com/users/Significant-Gravitas/orgs",
"repos_url": "https://api.github.com/users/Significant-Gravitas/repos",
"events_url": "https://api.github.com/users/Significant-Gravitas/events{/privacy}",
"received_events_url": "https://api.github.com/users/Significant-Gravitas/received_events",
"type": "Organization",
"user_view_type": "public",
"site_admin": false
},
"html_url": "https://github.com/Significant-Gravitas/AutoGPT",
"description": "AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.",
"fork": false,
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT",
"forks_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/forks",
"keys_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/keys{/key_id}",
"collaborators_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/collaborators{/collaborator}",
"teams_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/teams",
"hooks_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/hooks",
"issue_events_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/events{/number}",
"events_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/events",
"assignees_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/assignees{/user}",
"branches_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/branches{/branch}",
"tags_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/tags",
"blobs_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/blobs{/sha}",
"git_tags_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/tags{/sha}",
"git_refs_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/refs{/sha}",
"trees_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/trees{/sha}",
"statuses_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/statuses/{sha}",
"languages_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/languages",
"stargazers_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/stargazers",
"contributors_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/contributors",
"subscribers_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/subscribers",
"subscription_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/subscription",
"commits_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/commits{/sha}",
"git_commits_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/commits{/sha}",
"comments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/comments{/number}",
"issue_comment_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/comments{/number}",
"contents_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/contents/{+path}",
"compare_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/compare/{base}...{head}",
"merges_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/merges",
"archive_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/{archive_format}{/ref}",
"downloads_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/downloads",
"issues_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues{/number}",
"pulls_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls{/number}",
"milestones_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/milestones{/number}",
"notifications_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/notifications{?since,all,participating}",
"labels_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels{/name}",
"releases_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/releases{/id}",
"deployments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/deployments",
"created_at": "2023-03-16T09:21:07Z",
"updated_at": "2024-11-11T18:16:29Z",
"pushed_at": "2024-11-11T18:34:52Z",
"git_url": "git://github.com/Significant-Gravitas/AutoGPT.git",
"ssh_url": "git@github.com:Significant-Gravitas/AutoGPT.git",
"clone_url": "https://github.com/Significant-Gravitas/AutoGPT.git",
"svn_url": "https://github.com/Significant-Gravitas/AutoGPT",
"homepage": "https://agpt.co",
"size": 181894,
"stargazers_count": 168203,
"watchers_count": 168203,
"language": "Python",
"has_issues": true,
"has_projects": true,
"has_downloads": true,
"has_wiki": true,
"has_pages": false,
"has_discussions": true,
"forks_count": 44376,
"mirror_url": null,
"archived": false,
"disabled": false,
"open_issues_count": 189,
"license": {
"key": "other",
"name": "Other",
"spdx_id": "NOASSERTION",
"url": null,
"node_id": "MDc6TGljZW5zZTA="
},
"allow_forking": true,
"is_template": false,
"web_commit_signoff_required": false,
"topics": [
"ai",
"artificial-intelligence",
"autonomous-agents",
"gpt-4",
"openai",
"python"
],
"visibility": "public",
"forks": 44376,
"open_issues": 189,
"watchers": 168203,
"default_branch": "master",
"allow_squash_merge": true,
"allow_merge_commit": false,
"allow_rebase_merge": false,
"allow_auto_merge": true,
"delete_branch_on_merge": true,
"allow_update_branch": true,
"use_squash_pr_title_as_default": true,
"squash_merge_commit_message": "COMMIT_MESSAGES",
"squash_merge_commit_title": "PR_TITLE",
"merge_commit_message": "BLANK",
"merge_commit_title": "PR_TITLE"
}
},
"base": {
"label": "Significant-Gravitas:dev",
"ref": "dev",
"sha": "0b5b95eff5e18c1e162d2b30b66a7be2bed1cbc2",
"user": {
"login": "Significant-Gravitas",
"id": 130738209,
"node_id": "O_kgDOB8roIQ",
"avatar_url": "https://avatars.githubusercontent.com/u/130738209?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Significant-Gravitas",
"html_url": "https://github.com/Significant-Gravitas",
"followers_url": "https://api.github.com/users/Significant-Gravitas/followers",
"following_url": "https://api.github.com/users/Significant-Gravitas/following{/other_user}",
"gists_url": "https://api.github.com/users/Significant-Gravitas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Significant-Gravitas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Significant-Gravitas/subscriptions",
"organizations_url": "https://api.github.com/users/Significant-Gravitas/orgs",
"repos_url": "https://api.github.com/users/Significant-Gravitas/repos",
"events_url": "https://api.github.com/users/Significant-Gravitas/events{/privacy}",
"received_events_url": "https://api.github.com/users/Significant-Gravitas/received_events",
"type": "Organization",
"user_view_type": "public",
"site_admin": false
},
"repo": {
"id": 614765452,
"node_id": "R_kgDOJKSTjA",
"name": "AutoGPT",
"full_name": "Significant-Gravitas/AutoGPT",
"private": false,
"owner": {
"login": "Significant-Gravitas",
"id": 130738209,
"node_id": "O_kgDOB8roIQ",
"avatar_url": "https://avatars.githubusercontent.com/u/130738209?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Significant-Gravitas",
"html_url": "https://github.com/Significant-Gravitas",
"followers_url": "https://api.github.com/users/Significant-Gravitas/followers",
"following_url": "https://api.github.com/users/Significant-Gravitas/following{/other_user}",
"gists_url": "https://api.github.com/users/Significant-Gravitas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Significant-Gravitas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Significant-Gravitas/subscriptions",
"organizations_url": "https://api.github.com/users/Significant-Gravitas/orgs",
"repos_url": "https://api.github.com/users/Significant-Gravitas/repos",
"events_url": "https://api.github.com/users/Significant-Gravitas/events{/privacy}",
"received_events_url": "https://api.github.com/users/Significant-Gravitas/received_events",
"type": "Organization",
"user_view_type": "public",
"site_admin": false
},
"html_url": "https://github.com/Significant-Gravitas/AutoGPT",
"description": "AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.",
"fork": false,
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT",
"forks_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/forks",
"keys_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/keys{/key_id}",
"collaborators_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/collaborators{/collaborator}",
"teams_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/teams",
"hooks_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/hooks",
"issue_events_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/events{/number}",
"events_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/events",
"assignees_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/assignees{/user}",
"branches_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/branches{/branch}",
"tags_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/tags",
"blobs_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/blobs{/sha}",
"git_tags_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/tags{/sha}",
"git_refs_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/refs{/sha}",
"trees_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/trees{/sha}",
"statuses_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/statuses/{sha}",
"languages_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/languages",
"stargazers_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/stargazers",
"contributors_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/contributors",
"subscribers_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/subscribers",
"subscription_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/subscription",
"commits_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/commits{/sha}",
"git_commits_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/commits{/sha}",
"comments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/comments{/number}",
"issue_comment_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/comments{/number}",
"contents_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/contents/{+path}",
"compare_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/compare/{base}...{head}",
"merges_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/merges",
"archive_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/{archive_format}{/ref}",
"downloads_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/downloads",
"issues_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues{/number}",
"pulls_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls{/number}",
"milestones_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/milestones{/number}",
"notifications_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/notifications{?since,all,participating}",
"labels_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels{/name}",
"releases_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/releases{/id}",
"deployments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/deployments",
"created_at": "2023-03-16T09:21:07Z",
"updated_at": "2024-11-11T18:16:29Z",
"pushed_at": "2024-11-11T18:34:52Z",
"git_url": "git://github.com/Significant-Gravitas/AutoGPT.git",
"ssh_url": "git@github.com:Significant-Gravitas/AutoGPT.git",
"clone_url": "https://github.com/Significant-Gravitas/AutoGPT.git",
"svn_url": "https://github.com/Significant-Gravitas/AutoGPT",
"homepage": "https://agpt.co",
"size": 181894,
"stargazers_count": 168203,
"watchers_count": 168203,
"language": "Python",
"has_issues": true,
"has_projects": true,
"has_downloads": true,
"has_wiki": true,
"has_pages": false,
"has_discussions": true,
"forks_count": 44376,
"mirror_url": null,
"archived": false,
"disabled": false,
"open_issues_count": 189,
"license": {
"key": "other",
"name": "Other",
"spdx_id": "NOASSERTION",
"url": null,
"node_id": "MDc6TGljZW5zZTA="
},
"allow_forking": true,
"is_template": false,
"web_commit_signoff_required": false,
"topics": [
"ai",
"artificial-intelligence",
"autonomous-agents",
"gpt-4",
"openai",
"python"
],
"visibility": "public",
"forks": 44376,
"open_issues": 189,
"watchers": 168203,
"default_branch": "master",
"allow_squash_merge": true,
"allow_merge_commit": false,
"allow_rebase_merge": false,
"allow_auto_merge": true,
"delete_branch_on_merge": true,
"allow_update_branch": true,
"use_squash_pr_title_as_default": true,
"squash_merge_commit_message": "COMMIT_MESSAGES",
"squash_merge_commit_title": "PR_TITLE",
"merge_commit_message": "BLANK",
"merge_commit_title": "PR_TITLE"
}
},
"_links": {
"self": {
"href": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/8358"
},
"html": {
"href": "https://github.com/Significant-Gravitas/AutoGPT/pull/8358"
},
"issue": {
"href": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/8358"
},
"comments": {
"href": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/8358/comments"
},
"review_comments": {
"href": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/8358/comments"
},
"review_comment": {
"href": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/comments{/number}"
},
"commits": {
"href": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls/8358/commits"
},
"statuses": {
"href": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/statuses/8f708a2b60463eec10747d8f45dead35b5a45bd0"
}
},
"author_association": "MEMBER",
"auto_merge": null,
"active_lock_reason": null,
"merged": false,
"mergeable": null,
"rebaseable": null,
"mergeable_state": "unknown",
"merged_by": null,
"comments": 12,
"review_comments": 29,
"maintainer_can_modify": false,
"commits": 62,
"additions": 1674,
"deletions": 331,
"changed_files": 36
},
"before": "f40aef87672203f47bbbd53f83fae0964c5624da",
"after": "8f708a2b60463eec10747d8f45dead35b5a45bd0",
"repository": {
"id": 614765452,
"node_id": "R_kgDOJKSTjA",
"name": "AutoGPT",
"full_name": "Significant-Gravitas/AutoGPT",
"private": false,
"owner": {
"login": "Significant-Gravitas",
"id": 130738209,
"node_id": "O_kgDOB8roIQ",
"avatar_url": "https://avatars.githubusercontent.com/u/130738209?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Significant-Gravitas",
"html_url": "https://github.com/Significant-Gravitas",
"followers_url": "https://api.github.com/users/Significant-Gravitas/followers",
"following_url": "https://api.github.com/users/Significant-Gravitas/following{/other_user}",
"gists_url": "https://api.github.com/users/Significant-Gravitas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Significant-Gravitas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Significant-Gravitas/subscriptions",
"organizations_url": "https://api.github.com/users/Significant-Gravitas/orgs",
"repos_url": "https://api.github.com/users/Significant-Gravitas/repos",
"events_url": "https://api.github.com/users/Significant-Gravitas/events{/privacy}",
"received_events_url": "https://api.github.com/users/Significant-Gravitas/received_events",
"type": "Organization",
"user_view_type": "public",
"site_admin": false
},
"html_url": "https://github.com/Significant-Gravitas/AutoGPT",
"description": "AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.",
"fork": false,
"url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT",
"forks_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/forks",
"keys_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/keys{/key_id}",
"collaborators_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/collaborators{/collaborator}",
"teams_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/teams",
"hooks_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/hooks",
"issue_events_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/events{/number}",
"events_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/events",
"assignees_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/assignees{/user}",
"branches_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/branches{/branch}",
"tags_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/tags",
"blobs_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/blobs{/sha}",
"git_tags_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/tags{/sha}",
"git_refs_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/refs{/sha}",
"trees_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/trees{/sha}",
"statuses_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/statuses/{sha}",
"languages_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/languages",
"stargazers_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/stargazers",
"contributors_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/contributors",
"subscribers_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/subscribers",
"subscription_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/subscription",
"commits_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/commits{/sha}",
"git_commits_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/git/commits{/sha}",
"comments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/comments{/number}",
"issue_comment_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues/comments{/number}",
"contents_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/contents/{+path}",
"compare_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/compare/{base}...{head}",
"merges_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/merges",
"archive_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/{archive_format}{/ref}",
"downloads_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/downloads",
"issues_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/issues{/number}",
"pulls_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/pulls{/number}",
"milestones_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/milestones{/number}",
"notifications_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/notifications{?since,all,participating}",
"labels_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/labels{/name}",
"releases_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/releases{/id}",
"deployments_url": "https://api.github.com/repos/Significant-Gravitas/AutoGPT/deployments",
"created_at": "2023-03-16T09:21:07Z",
"updated_at": "2024-11-11T18:16:29Z",
"pushed_at": "2024-11-11T18:34:52Z",
"git_url": "git://github.com/Significant-Gravitas/AutoGPT.git",
"ssh_url": "git@github.com:Significant-Gravitas/AutoGPT.git",
"clone_url": "https://github.com/Significant-Gravitas/AutoGPT.git",
"svn_url": "https://github.com/Significant-Gravitas/AutoGPT",
"homepage": "https://agpt.co",
"size": 181894,
"stargazers_count": 168203,
"watchers_count": 168203,
"language": "Python",
"has_issues": true,
"has_projects": true,
"has_downloads": true,
"has_wiki": true,
"has_pages": false,
"has_discussions": true,
"forks_count": 44376,
"mirror_url": null,
"archived": false,
"disabled": false,
"open_issues_count": 189,
"license": {
"key": "other",
"name": "Other",
"spdx_id": "NOASSERTION",
"url": null,
"node_id": "MDc6TGljZW5zZTA="
},
"allow_forking": true,
"is_template": false,
"web_commit_signoff_required": false,
"topics": [
"ai",
"artificial-intelligence",
"autonomous-agents",
"gpt-4",
"openai",
"python"
],
"visibility": "public",
"forks": 44376,
"open_issues": 189,
"watchers": 168203,
"default_branch": "master",
"custom_properties": {
}
},
"organization": {
"login": "Significant-Gravitas",
"id": 130738209,
"node_id": "O_kgDOB8roIQ",
"url": "https://api.github.com/orgs/Significant-Gravitas",
"repos_url": "https://api.github.com/orgs/Significant-Gravitas/repos",
"events_url": "https://api.github.com/orgs/Significant-Gravitas/events",
"hooks_url": "https://api.github.com/orgs/Significant-Gravitas/hooks",
"issues_url": "https://api.github.com/orgs/Significant-Gravitas/issues",
"members_url": "https://api.github.com/orgs/Significant-Gravitas/members{/member}",
"public_members_url": "https://api.github.com/orgs/Significant-Gravitas/public_members{/member}",
"avatar_url": "https://avatars.githubusercontent.com/u/130738209?v=4",
"description": ""
},
"enterprise": {
"id": 149607,
"slug": "significant-gravitas",
"name": "Significant Gravitas",
"node_id": "E_kgDOAAJIZw",
"avatar_url": "https://avatars.githubusercontent.com/b/149607?v=4",
"description": "The creators of AutoGPT",
"website_url": "discord.gg/autogpt",
"html_url": "https://github.com/enterprises/significant-gravitas",
"created_at": "2024-04-18T17:43:53Z",
"updated_at": "2024-10-23T16:59:55Z"
},
"sender": {
"login": "Pwuts",
"id": 12185583,
"node_id": "MDQ6VXNlcjEyMTg1NTgz",
"avatar_url": "https://avatars.githubusercontent.com/u/12185583?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Pwuts",
"html_url": "https://github.com/Pwuts",
"followers_url": "https://api.github.com/users/Pwuts/followers",
"following_url": "https://api.github.com/users/Pwuts/following{/other_user}",
"gists_url": "https://api.github.com/users/Pwuts/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Pwuts/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Pwuts/subscriptions",
"organizations_url": "https://api.github.com/users/Pwuts/orgs",
"repos_url": "https://api.github.com/users/Pwuts/repos",
"events_url": "https://api.github.com/users/Pwuts/events{/privacy}",
"received_events_url": "https://api.github.com/users/Pwuts/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
}

View File

@ -1,9 +1,11 @@
import requests
from urllib.parse import urlparse
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from ._api import get_api
from ._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
@ -13,6 +15,10 @@ from ._auth import (
)
def is_github_url(url: str) -> bool:
return urlparse(url).netloc == "github.com"
# --8<-- [start:GithubCommentBlockExample]
class GithubCommentBlock(Block):
class Input(BlockSchema):
@ -40,15 +46,27 @@ class GithubCommentBlock(Block):
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubCommentBlock.Input,
output_schema=GithubCommentBlock.Output,
test_input={
"issue_url": "https://github.com/owner/repo/issues/1",
"comment": "This is a test comment.",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_input=[
{
"issue_url": "https://github.com/owner/repo/issues/1",
"comment": "This is a test comment.",
"credentials": TEST_CREDENTIALS_INPUT,
},
{
"issue_url": "https://github.com/owner/repo/pull/1",
"comment": "This is a test comment.",
"credentials": TEST_CREDENTIALS_INPUT,
},
],
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", 1337),
("url", "https://github.com/owner/repo/issues/1#issuecomment-1337"),
("id", 1337),
(
"url",
"https://github.com/owner/repo/issues/1#issuecomment-1337",
),
],
test_mock={
"post_comment": lambda *args, **kwargs: (
@ -62,27 +80,12 @@ class GithubCommentBlock(Block):
def post_comment(
credentials: GithubCredentials, issue_url: str, body_text: str
) -> tuple[int, str]:
if "/pull/" in issue_url:
api_url = (
issue_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/issues/"
)
+ "/comments"
)
else:
api_url = (
issue_url.replace("github.com", "api.github.com/repos") + "/comments"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
data = {"body": body_text}
response = requests.post(api_url, headers=headers, json=data)
response.raise_for_status()
if "pull" in issue_url:
issue_url = issue_url.replace("pull", "issues")
comments_url = issue_url + "/comments"
response = api.post(comments_url, json=data)
comment = response.json()
return comment["id"], comment["html_url"]
@ -156,16 +159,10 @@ class GithubMakeIssueBlock(Block):
def create_issue(
credentials: GithubCredentials, repo_url: str, title: str, body: str
) -> tuple[int, str]:
api_url = repo_url.replace("github.com", "api.github.com/repos") + "/issues"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
data = {"title": title, "body": body}
response = requests.post(api_url, headers=headers, json=data)
response.raise_for_status()
issues_url = repo_url + "/issues"
response = api.post(issues_url, json=data)
issue = response.json()
return issue["number"], issue["html_url"]
@ -232,21 +229,12 @@ class GithubReadIssueBlock(Block):
def read_issue(
credentials: GithubCredentials, issue_url: str
) -> tuple[str, str, str]:
api_url = issue_url.replace("github.com", "api.github.com/repos")
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
response = api.get(issue_url)
data = response.json()
title = data.get("title", "No title found")
body = data.get("body", "No body content found")
user = data.get("user", {}).get("login", "No user found")
return title, body, user
def run(
@ -260,9 +248,12 @@ class GithubReadIssueBlock(Block):
credentials,
input_data.issue_url,
)
yield "title", title
yield "body", body
yield "user", user
if title:
yield "title", title
if body:
yield "body", body
if user:
yield "user", user
class GithubListIssuesBlock(Block):
@ -318,20 +309,13 @@ class GithubListIssuesBlock(Block):
def list_issues(
credentials: GithubCredentials, repo_url: str
) -> list[Output.IssueItem]:
api_url = repo_url.replace("github.com", "api.github.com/repos") + "/issues"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
issues_url = repo_url + "/issues"
response = api.get(issues_url)
data = response.json()
issues: list[GithubListIssuesBlock.Output.IssueItem] = [
{"title": issue["title"], "url": issue["html_url"]} for issue in data
]
return issues
def run(
@ -385,28 +369,10 @@ class GithubAddLabelBlock(Block):
@staticmethod
def add_label(credentials: GithubCredentials, issue_url: str, label: str) -> str:
# Convert the provided GitHub URL to the API URL
if "/pull/" in issue_url:
api_url = (
issue_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/issues/"
)
+ "/labels"
)
else:
api_url = (
issue_url.replace("github.com", "api.github.com/repos") + "/labels"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
data = {"labels": [label]}
response = requests.post(api_url, headers=headers, json=data)
response.raise_for_status()
labels_url = issue_url + "/labels"
api.post(labels_url, json=data)
return "Label added successfully"
def run(
@ -463,31 +429,9 @@ class GithubRemoveLabelBlock(Block):
@staticmethod
def remove_label(credentials: GithubCredentials, issue_url: str, label: str) -> str:
# Convert the provided GitHub URL to the API URL
if "/pull/" in issue_url:
api_url = (
issue_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/issues/"
)
+ f"/labels/{label}"
)
else:
api_url = (
issue_url.replace("github.com", "api.github.com/repos")
+ f"/labels/{label}"
)
# Log the constructed API URL for debugging
print(f"Constructed API URL: {api_url}")
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.delete(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
label_url = issue_url + f"/labels/{label}"
api.delete(label_url)
return "Label removed successfully"
def run(
@ -550,23 +494,10 @@ class GithubAssignIssueBlock(Block):
issue_url: str,
assignee: str,
) -> str:
# Extracting repo path and issue number from the issue URL
repo_path, issue_number = issue_url.replace("https://github.com/", "").split(
"/issues/"
)
api_url = (
f"https://api.github.com/repos/{repo_path}/issues/{issue_number}/assignees"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
assignees_url = issue_url + "/assignees"
data = {"assignees": [assignee]}
response = requests.post(api_url, headers=headers, json=data)
response.raise_for_status()
api.post(assignees_url, json=data)
return "Issue assigned successfully"
def run(
@ -629,23 +560,10 @@ class GithubUnassignIssueBlock(Block):
issue_url: str,
assignee: str,
) -> str:
# Extracting repo path and issue number from the issue URL
repo_path, issue_number = issue_url.replace("https://github.com/", "").split(
"/issues/"
)
api_url = (
f"https://api.github.com/repos/{repo_path}/issues/{issue_number}/assignees"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
assignees_url = issue_url + "/assignees"
data = {"assignees": [assignee]}
response = requests.delete(api_url, headers=headers, json=data)
response.raise_for_status()
api.delete(assignees_url, json=data)
return "Issue unassigned successfully"
def run(

View File

@ -1,9 +1,11 @@
import requests
import re
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from ._api import get_api
from ._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
@ -64,20 +66,13 @@ class GithubListPullRequestsBlock(Block):
@staticmethod
def list_prs(credentials: GithubCredentials, repo_url: str) -> list[Output.PRItem]:
api_url = repo_url.replace("github.com", "api.github.com/repos") + "/pulls"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
pulls_url = repo_url + "/pulls"
response = api.get(pulls_url)
data = response.json()
pull_requests: list[GithubListPullRequestsBlock.Output.PRItem] = [
{"title": pr["title"], "url": pr["html_url"]} for pr in data
]
return pull_requests
def run(
@ -110,7 +105,11 @@ class GithubMakePullRequestBlock(Block):
placeholder="Enter the pull request body",
)
head: str = SchemaField(
description="The name of the branch where your changes are implemented. For cross-repository pull requests in the same network, namespace head with a user like this: username:branch.",
description=(
"The name of the branch where your changes are implemented. "
"For cross-repository pull requests in the same network, "
"namespace head with a user like this: username:branch."
),
placeholder="Enter the head branch",
)
base: str = SchemaField(
@ -162,17 +161,10 @@ class GithubMakePullRequestBlock(Block):
head: str,
base: str,
) -> tuple[int, str]:
repo_path = repo_url.replace("https://github.com/", "")
api_url = f"https://api.github.com/repos/{repo_path}/pulls"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
pulls_url = repo_url + "/pulls"
data = {"title": title, "body": body, "head": head, "base": base}
response = requests.post(api_url, headers=headers, json=data)
response.raise_for_status()
response = api.post(pulls_url, json=data)
pr_data = response.json()
return pr_data["number"], pr_data["html_url"]
@ -194,13 +186,8 @@ class GithubMakePullRequestBlock(Block):
)
yield "number", number
yield "url", url
except requests.exceptions.HTTPError as http_err:
if http_err.response.status_code == 422:
error_details = http_err.response.json()
error_message = error_details.get("message", "Unknown error")
else:
error_message = str(http_err)
raise RuntimeError(f"Failed to create pull request: {error_message}")
except Exception as e:
yield "error", str(e)
class GithubReadPullRequestBlock(Block):
@ -255,42 +242,21 @@ class GithubReadPullRequestBlock(Block):
@staticmethod
def read_pr(credentials: GithubCredentials, pr_url: str) -> tuple[str, str, str]:
api_url = pr_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/issues/"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
# Adjust the URL to access the issue endpoint for PR metadata
issue_url = pr_url.replace("/pull/", "/issues/")
response = api.get(issue_url)
data = response.json()
title = data.get("title", "No title found")
body = data.get("body", "No body content found")
author = data.get("user", {}).get("login", "No user found")
return title, body, author
@staticmethod
def read_pr_changes(credentials: GithubCredentials, pr_url: str) -> str:
api_url = (
pr_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/pulls/"
)
+ "/files"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
files_url = prepare_pr_api_url(pr_url=pr_url, path="files")
response = api.get(files_url)
files = response.json()
changes = []
for file in files:
@ -298,7 +264,6 @@ class GithubReadPullRequestBlock(Block):
patch = file.get("patch")
if filename and patch:
changes.append(f"File: {filename}\n{patch}")
return "\n\n".join(changes)
def run(
@ -367,23 +332,10 @@ class GithubAssignPRReviewerBlock(Block):
def assign_reviewer(
credentials: GithubCredentials, pr_url: str, reviewer: str
) -> str:
# Convert the PR URL to the appropriate API endpoint
api_url = (
pr_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/pulls/"
)
+ "/requested_reviewers"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
reviewers_url = prepare_pr_api_url(pr_url=pr_url, path="requested_reviewers")
data = {"reviewers": [reviewer]}
response = requests.post(api_url, headers=headers, json=data)
response.raise_for_status()
api.post(reviewers_url, json=data)
return "Reviewer assigned successfully"
def run(
@ -400,17 +352,8 @@ class GithubAssignPRReviewerBlock(Block):
input_data.reviewer,
)
yield "status", status
except requests.exceptions.HTTPError as http_err:
if http_err.response.status_code == 422:
error_msg = (
"Failed to assign reviewer: "
f"The reviewer '{input_data.reviewer}' may not have permission "
"or the pull request is not in a valid state. "
f"Detailed error: {http_err.response.text}"
)
else:
error_msg = f"HTTP error: {http_err} - {http_err.response.text}"
raise RuntimeError(error_msg)
except Exception as e:
yield "error", str(e)
class GithubUnassignPRReviewerBlock(Block):
@ -456,21 +399,10 @@ class GithubUnassignPRReviewerBlock(Block):
def unassign_reviewer(
credentials: GithubCredentials, pr_url: str, reviewer: str
) -> str:
api_url = (
pr_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/pulls/"
)
+ "/requested_reviewers"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
api = get_api(credentials)
reviewers_url = prepare_pr_api_url(pr_url=pr_url, path="requested_reviewers")
data = {"reviewers": [reviewer]}
response = requests.delete(api_url, headers=headers, json=data)
response.raise_for_status()
api.delete(reviewers_url, json=data)
return "Reviewer unassigned successfully"
def run(
@ -480,12 +412,15 @@ class GithubUnassignPRReviewerBlock(Block):
credentials: GithubCredentials,
**kwargs,
) -> BlockOutput:
status = self.unassign_reviewer(
credentials,
input_data.pr_url,
input_data.reviewer,
)
yield "status", status
try:
status = self.unassign_reviewer(
credentials,
input_data.pr_url,
input_data.reviewer,
)
yield "status", status
except Exception as e:
yield "error", str(e)
class GithubListPRReviewersBlock(Block):
@ -544,26 +479,14 @@ class GithubListPRReviewersBlock(Block):
def list_reviewers(
credentials: GithubCredentials, pr_url: str
) -> list[Output.ReviewerItem]:
api_url = (
pr_url.replace("github.com", "api.github.com/repos").replace(
"/pull/", "/pulls/"
)
+ "/requested_reviewers"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
reviewers_url = prepare_pr_api_url(pr_url=pr_url, path="requested_reviewers")
response = api.get(reviewers_url)
data = response.json()
reviewers: list[GithubListPRReviewersBlock.Output.ReviewerItem] = [
{"username": reviewer["login"], "url": reviewer["html_url"]}
for reviewer in data.get("users", [])
]
return reviewers
def run(
@ -578,3 +501,14 @@ class GithubListPRReviewersBlock(Block):
input_data.pr_url,
)
yield from (("reviewer", reviewer) for reviewer in reviewers)
def prepare_pr_api_url(pr_url: str, path: str) -> str:
# Pattern to capture the base repository URL and the pull request number
pattern = r"^(?:https?://)?([^/]+/[^/]+/[^/]+)/pull/(\d+)"
match = re.match(pattern, pr_url)
if not match:
return pr_url
base_url, pr_number = match.groups()
return f"{base_url}/pulls/{pr_number}/{path}"

View File

@ -1,11 +1,11 @@
import base64
import requests
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from ._api import get_api
from ._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
@ -68,17 +68,11 @@ class GithubListTagsBlock(Block):
def list_tags(
credentials: GithubCredentials, repo_url: str
) -> list[Output.TagItem]:
repo_path = repo_url.replace("https://github.com/", "")
api_url = f"https://api.github.com/repos/{repo_path}/tags"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
tags_url = repo_url + "/tags"
response = api.get(tags_url)
data = response.json()
repo_path = repo_url.replace("https://github.com/", "")
tags: list[GithubListTagsBlock.Output.TagItem] = [
{
"name": tag["name"],
@ -86,7 +80,6 @@ class GithubListTagsBlock(Block):
}
for tag in data
]
return tags
def run(
@ -157,20 +150,18 @@ class GithubListBranchesBlock(Block):
def list_branches(
credentials: GithubCredentials, repo_url: str
) -> list[Output.BranchItem]:
api_url = repo_url.replace("github.com", "api.github.com/repos") + "/branches"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
branches_url = repo_url + "/branches"
response = api.get(branches_url)
data = response.json()
repo_path = repo_url.replace("https://github.com/", "")
branches: list[GithubListBranchesBlock.Output.BranchItem] = [
{"name": branch["name"], "url": branch["commit"]["url"]} for branch in data
{
"name": branch["name"],
"url": f"https://github.com/{repo_path}/tree/{branch['name']}",
}
for branch in data
]
return branches
def run(
@ -246,6 +237,8 @@ class GithubListDiscussionsBlock(Block):
def list_discussions(
credentials: GithubCredentials, repo_url: str, num_discussions: int
) -> list[Output.DiscussionItem]:
api = get_api(credentials)
# The GitHub GraphQL API has its own endpoint, so we call api.post with the full URL
repo_path = repo_url.replace("https://github.com/", "")
owner, repo = repo_path.split("/")
query = """
@ -261,24 +254,15 @@ class GithubListDiscussionsBlock(Block):
}
"""
variables = {"owner": owner, "repo": repo, "num": num_discussions}
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.post(
response = api.post(
"https://api.github.com/graphql",
json={"query": query, "variables": variables},
headers=headers,
)
response.raise_for_status()
data = response.json()
discussions: list[GithubListDiscussionsBlock.Output.DiscussionItem] = [
{"title": discussion["title"], "url": discussion["url"]}
for discussion in data["data"]["repository"]["discussions"]["nodes"]
]
return discussions
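# A standalone sketch of the same GraphQL call with plain requests (the block
# routes it through get_api); the query body is abbreviated in the diff above,
# so this is an assumed minimal form, and the token value is hypothetical.
import requests

def list_discussion_titles(token: str, owner: str, repo: str, num: int) -> list[str]:
    # GraphQL uses a single POST endpoint; the query and its variables travel in the JSON body.
    query = """
    query($owner: String!, $repo: String!, $num: Int!) {
      repository(owner: $owner, name: $repo) {
        discussions(first: $num) { nodes { title url } }
      }
    }
    """
    response = requests.post(
        "https://api.github.com/graphql",
        json={"query": query, "variables": {"owner": owner, "repo": repo, "num": num}},
        headers={"Authorization": f"Bearer {token}"},
    )
    response.raise_for_status()
    nodes = response.json()["data"]["repository"]["discussions"]["nodes"]
    return [node["title"] for node in nodes]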
def run(
@ -348,21 +332,13 @@ class GithubListReleasesBlock(Block):
def list_releases(
credentials: GithubCredentials, repo_url: str
) -> list[Output.ReleaseItem]:
repo_path = repo_url.replace("https://github.com/", "")
api_url = f"https://api.github.com/repos/{repo_path}/releases"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
releases_url = repo_url + "/releases"
response = api.get(releases_url)
data = response.json()
releases: list[GithubListReleasesBlock.Output.ReleaseItem] = [
{"name": release["name"], "url": release["html_url"]} for release in data
]
return releases
def run(
@ -432,16 +408,9 @@ class GithubReadFileBlock(Block):
def read_file(
credentials: GithubCredentials, repo_url: str, file_path: str, branch: str
) -> tuple[str, int]:
repo_path = repo_url.replace("https://github.com/", "")
api_url = f"https://api.github.com/repos/{repo_path}/contents/{file_path}?ref={branch}"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
content_url = repo_url + f"/contents/{file_path}?ref={branch}"
response = api.get(content_url)
content = response.json()
if isinstance(content, list):
@ -549,46 +518,33 @@ class GithubReadFolderBlock(Block):
def read_folder(
credentials: GithubCredentials, repo_url: str, folder_path: str, branch: str
) -> tuple[list[Output.FileEntry], list[Output.DirEntry]]:
repo_path = repo_url.replace("https://github.com/", "")
api_url = f"https://api.github.com/repos/{repo_path}/contents/{folder_path}?ref={branch}"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
contents_url = repo_url + f"/contents/{folder_path}?ref={branch}"
response = api.get(contents_url)
content = response.json()
if isinstance(content, list):
# Multiple entries of different types exist at this path
if not (dir := next((d for d in content if d["type"] == "dir"), None)):
raise TypeError("Not a folder")
content = dir
if content["type"] != "dir":
if not isinstance(content, list):
raise TypeError("Not a folder")
return (
[
GithubReadFolderBlock.Output.FileEntry(
name=entry["name"],
path=entry["path"],
size=entry["size"],
)
for entry in content["entries"]
if entry["type"] == "file"
],
[
GithubReadFolderBlock.Output.DirEntry(
name=entry["name"],
path=entry["path"],
)
for entry in content["entries"]
if entry["type"] == "dir"
],
)
files = [
GithubReadFolderBlock.Output.FileEntry(
name=entry["name"],
path=entry["path"],
size=entry["size"],
)
for entry in content
if entry["type"] == "file"
]
dirs = [
GithubReadFolderBlock.Output.DirEntry(
name=entry["name"],
path=entry["path"],
)
for entry in content
if entry["type"] == "dir"
]
return files, dirs
def run(
self,
@ -656,26 +612,16 @@ class GithubMakeBranchBlock(Block):
new_branch: str,
source_branch: str,
) -> str:
repo_path = repo_url.replace("https://github.com/", "")
ref_api_url = (
f"https://api.github.com/repos/{repo_path}/git/refs/heads/{source_branch}"
)
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.get(ref_api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
# Get the SHA of the source branch
ref_url = repo_url + f"/git/refs/heads/{source_branch}"
response = api.get(ref_url)
sha = response.json()["object"]["sha"]
create_branch_api_url = f"https://api.github.com/repos/{repo_path}/git/refs"
# Create the new branch
create_ref_url = repo_url + "/git/refs"
data = {"ref": f"refs/heads/{new_branch}", "sha": sha}
response = requests.post(create_branch_api_url, headers=headers, json=data)
response.raise_for_status()
response = api.post(create_ref_url, json=data)
return "Branch created successfully"
def run(
@ -735,16 +681,9 @@ class GithubDeleteBranchBlock(Block):
def delete_branch(
credentials: GithubCredentials, repo_url: str, branch: str
) -> str:
repo_path = repo_url.replace("https://github.com/", "")
api_url = f"https://api.github.com/repos/{repo_path}/git/refs/heads/{branch}"
headers = {
"Authorization": credentials.bearer(),
"Accept": "application/vnd.github.v3+json",
}
response = requests.delete(api_url, headers=headers)
response.raise_for_status()
api = get_api(credentials)
ref_url = repo_url + f"/git/refs/heads/{branch}"
api.delete(ref_url)
return "Branch deleted successfully"
def run(
@ -760,3 +699,420 @@ class GithubDeleteBranchBlock(Block):
input_data.branch,
)
yield "status", status
class GithubCreateFileBlock(Block):
class Input(BlockSchema):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
file_path: str = SchemaField(
description="Path where the file should be created",
placeholder="path/to/file.txt",
)
content: str = SchemaField(
description="Content to write to the file",
placeholder="File content here",
)
branch: str = SchemaField(
description="Branch where the file should be created",
default="main",
)
commit_message: str = SchemaField(
description="Message for the commit",
default="Create new file",
)
class Output(BlockSchema):
url: str = SchemaField(description="URL of the created file")
sha: str = SchemaField(description="SHA of the commit")
error: str = SchemaField(
description="Error message if the file creation failed"
)
def __init__(self):
super().__init__(
id="8fd132ac-b917-428a-8159-d62893e8a3fe",
description="This block creates a new file in a GitHub repository.",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubCreateFileBlock.Input,
output_schema=GithubCreateFileBlock.Output,
test_input={
"repo_url": "https://github.com/owner/repo",
"file_path": "test/file.txt",
"content": "Test content",
"branch": "main",
"commit_message": "Create test file",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("url", "https://github.com/owner/repo/blob/main/test/file.txt"),
("sha", "abc123"),
],
test_mock={
"create_file": lambda *args, **kwargs: (
"https://github.com/owner/repo/blob/main/test/file.txt",
"abc123",
)
},
)
@staticmethod
def create_file(
credentials: GithubCredentials,
repo_url: str,
file_path: str,
content: str,
branch: str,
commit_message: str,
) -> tuple[str, str]:
api = get_api(credentials)
# Convert content to base64
content_bytes = content.encode("utf-8")
content_base64 = base64.b64encode(content_bytes).decode("utf-8")
# Create the file using the GitHub API
contents_url = f"{repo_url}/contents/{file_path}"
data = {
"message": commit_message,
"content": content_base64,
"branch": branch,
}
response = api.put(contents_url, json=data)
result = response.json()
return result["content"]["html_url"], result["commit"]["sha"]
def run(
self,
input_data: Input,
*,
credentials: GithubCredentials,
**kwargs,
) -> BlockOutput:
try:
url, sha = self.create_file(
credentials,
input_data.repo_url,
input_data.file_path,
input_data.content,
input_data.branch,
input_data.commit_message,
)
yield "url", url
yield "sha", sha
except Exception as e:
yield "error", str(e)
class GithubUpdateFileBlock(Block):
class Input(BlockSchema):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
file_path: str = SchemaField(
description="Path to the file to update",
placeholder="path/to/file.txt",
)
content: str = SchemaField(
description="New content for the file",
placeholder="Updated content here",
)
branch: str = SchemaField(
description="Branch containing the file",
default="main",
)
commit_message: str = SchemaField(
description="Message for the commit",
default="Update file",
)
class Output(BlockSchema):
url: str = SchemaField(description="URL of the updated file")
sha: str = SchemaField(description="SHA of the commit")
error: str = SchemaField(description="Error message if the file update failed")
def __init__(self):
super().__init__(
id="30be12a4-57cb-4aa4-baf5-fcc68d136076",
description="This block updates an existing file in a GitHub repository.",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubUpdateFileBlock.Input,
output_schema=GithubUpdateFileBlock.Output,
test_input={
"repo_url": "https://github.com/owner/repo",
"file_path": "test/file.txt",
"content": "Updated content",
"branch": "main",
"commit_message": "Update test file",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("url", "https://github.com/owner/repo/blob/main/test/file.txt"),
("sha", "def456"),
],
test_mock={
"update_file": lambda *args, **kwargs: (
"https://github.com/owner/repo/blob/main/test/file.txt",
"def456",
)
},
)
@staticmethod
def update_file(
credentials: GithubCredentials,
repo_url: str,
file_path: str,
content: str,
branch: str,
commit_message: str,
) -> tuple[str, str]:
api = get_api(credentials)
# First get the current file to get its SHA
contents_url = f"{repo_url}/contents/{file_path}"
params = {"ref": branch}
response = api.get(contents_url, params=params)
current_file = response.json()
# Convert new content to base64
content_bytes = content.encode("utf-8")
content_base64 = base64.b64encode(content_bytes).decode("utf-8")
# Update the file
data = {
"message": commit_message,
"content": content_base64,
"sha": current_file["sha"],
"branch": branch,
}
response = api.put(contents_url, json=data)
result = response.json()
return result["content"]["html_url"], result["commit"]["sha"]
def run(
self,
input_data: Input,
*,
credentials: GithubCredentials,
**kwargs,
) -> BlockOutput:
try:
url, sha = self.update_file(
credentials,
input_data.repo_url,
input_data.file_path,
input_data.content,
input_data.branch,
input_data.commit_message,
)
yield "url", url
yield "sha", sha
except Exception as e:
yield "error", str(e)
class GithubCreateRepositoryBlock(Block):
class Input(BlockSchema):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
name: str = SchemaField(
description="Name of the repository to create",
placeholder="my-new-repo",
)
description: str = SchemaField(
description="Description of the repository",
placeholder="A description of the repository",
default="",
)
private: bool = SchemaField(
description="Whether the repository should be private",
default=False,
)
auto_init: bool = SchemaField(
description="Whether to initialize the repository with a README",
default=True,
)
gitignore_template: str = SchemaField(
description="Git ignore template to use (e.g., Python, Node, Java)",
default="",
)
class Output(BlockSchema):
url: str = SchemaField(description="URL of the created repository")
clone_url: str = SchemaField(description="Git clone URL of the repository")
error: str = SchemaField(
description="Error message if the repository creation failed"
)
def __init__(self):
super().__init__(
id="029ec3b8-1cfd-46d3-b6aa-28e4a706efd1",
description="This block creates a new GitHub repository.",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubCreateRepositoryBlock.Input,
output_schema=GithubCreateRepositoryBlock.Output,
test_input={
"name": "test-repo",
"description": "A test repository",
"private": False,
"auto_init": True,
"gitignore_template": "Python",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("url", "https://github.com/owner/test-repo"),
("clone_url", "https://github.com/owner/test-repo.git"),
],
test_mock={
"create_repository": lambda *args, **kwargs: (
"https://github.com/owner/test-repo",
"https://github.com/owner/test-repo.git",
)
},
)
@staticmethod
def create_repository(
credentials: GithubCredentials,
name: str,
description: str,
private: bool,
auto_init: bool,
gitignore_template: str,
) -> tuple[str, str]:
api = get_api(credentials, convert_urls=False) # Disable URL conversion
data = {
"name": name,
"description": description,
"private": private,
"auto_init": auto_init,
}
if gitignore_template:
data["gitignore_template"] = gitignore_template
# Create repository using the user endpoint
response = api.post("https://api.github.com/user/repos", json=data)
result = response.json()
return result["html_url"], result["clone_url"]
def run(
self,
input_data: Input,
*,
credentials: GithubCredentials,
**kwargs,
) -> BlockOutput:
try:
url, clone_url = self.create_repository(
credentials,
input_data.name,
input_data.description,
input_data.private,
input_data.auto_init,
input_data.gitignore_template,
)
yield "url", url
yield "clone_url", clone_url
except Exception as e:
yield "error", str(e)
class GithubListStargazersBlock(Block):
class Input(BlockSchema):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
class Output(BlockSchema):
class StargazerItem(TypedDict):
username: str
url: str
stargazer: StargazerItem = SchemaField(
title="Stargazer",
description="Stargazers with their username and profile URL",
)
error: str = SchemaField(
description="Error message if listing stargazers failed"
)
def __init__(self):
super().__init__(
id="a4b9c2d1-e5f6-4g7h-8i9j-0k1l2m3n4o5p", # Generated unique UUID
description="This block lists all users who have starred a specified GitHub repository.",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubListStargazersBlock.Input,
output_schema=GithubListStargazersBlock.Output,
test_input={
"repo_url": "https://github.com/owner/repo",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"stargazer",
{
"username": "octocat",
"url": "https://github.com/octocat",
},
)
],
test_mock={
"list_stargazers": lambda *args, **kwargs: [
{
"username": "octocat",
"url": "https://github.com/octocat",
}
]
},
)
@staticmethod
def list_stargazers(
credentials: GithubCredentials, repo_url: str
) -> list[Output.StargazerItem]:
api = get_api(credentials)
# Append /stargazers to the repo URL to target the stargazers endpoint
stargazers_url = f"{repo_url}/stargazers"
# Set accept header to get starred_at timestamp
headers = {"Accept": "application/vnd.github.star+json"}
response = api.get(stargazers_url, headers=headers)
data = response.json()
stargazers: list[GithubListStargazersBlock.Output.StargazerItem] = [
{
"username": stargazer["login"],
"url": stargazer["html_url"],
}
for stargazer in data
]
return stargazers
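# A standalone sketch of the stargazers call with plain requests; note that the
# "application/vnd.github.star+json" media type nests each user under "user" and
# adds a "starred_at" timestamp. owner/repo/token here are hypothetical.
import requests

def list_stargazers_with_dates(token: str, owner: str, repo: str) -> list[dict]:
    response = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/stargazers",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github.star+json",
        },
    )
    response.raise_for_status()
    return [
        {"username": item["user"]["login"], "starred_at": item["starred_at"]}
        for item in response.json()
    ]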
def run(
self,
input_data: Input,
*,
credentials: GithubCredentials,
**kwargs,
) -> BlockOutput:
try:
stargazers = self.list_stargazers(
credentials,
input_data.repo_url,
)
yield from (("stargazer", stargazer) for stargazer in stargazers)
except Exception as e:
yield "error", str(e)

View File

@ -0,0 +1,158 @@
import json
import logging
from pathlib import Path
from pydantic import BaseModel
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockWebhookConfig,
)
from backend.data.model import SchemaField
from ._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
GithubCredentialsField,
GithubCredentialsInput,
)
logger = logging.getLogger(__name__)
# --8<-- [start:GithubTriggerExample]
class GitHubTriggerBase:
class Input(BlockSchema):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo: str = SchemaField(
description=(
"Repository to subscribe to.\n\n"
"**Note:** Make sure your GitHub credentials have permissions "
"to create webhooks on this repo."
),
placeholder="{owner}/{repo}",
)
# --8<-- [start:example-payload-field]
payload: dict = SchemaField(hidden=True, default={})
# --8<-- [end:example-payload-field]
class Output(BlockSchema):
payload: dict = SchemaField(
description="The complete webhook payload that was received from GitHub. "
"Includes information about the affected resource (e.g. pull request), "
"the event, and the user who triggered the event."
)
triggered_by_user: dict = SchemaField(
description="Object representing the GitHub user who triggered the event"
)
error: str = SchemaField(
description="Error message if the payload could not be processed"
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "payload", input_data.payload
yield "triggered_by_user", input_data.payload["sender"]
class GithubPullRequestTriggerBlock(GitHubTriggerBase, Block):
EXAMPLE_PAYLOAD_FILE = (
Path(__file__).parent / "example_payloads" / "pull_request.synchronize.json"
)
# --8<-- [start:example-event-filter]
class Input(GitHubTriggerBase.Input):
class EventsFilter(BaseModel):
"""
https://docs.github.com/en/webhooks/webhook-events-and-payloads#pull_request
"""
opened: bool = False
edited: bool = False
closed: bool = False
reopened: bool = False
synchronize: bool = False
assigned: bool = False
unassigned: bool = False
labeled: bool = False
unlabeled: bool = False
converted_to_draft: bool = False
locked: bool = False
unlocked: bool = False
enqueued: bool = False
dequeued: bool = False
milestoned: bool = False
demilestoned: bool = False
ready_for_review: bool = False
review_requested: bool = False
review_request_removed: bool = False
auto_merge_enabled: bool = False
auto_merge_disabled: bool = False
events: EventsFilter = SchemaField(
title="Events", description="The events to subscribe to"
)
# --8<-- [end:example-event-filter]
class Output(GitHubTriggerBase.Output):
event: str = SchemaField(
description="The PR event that triggered the webhook (e.g. 'opened')"
)
number: int = SchemaField(description="The number of the affected pull request")
pull_request: dict = SchemaField(
description="Object representing the affected pull request"
)
pull_request_url: str = SchemaField(
description="The URL of the affected pull request"
)
def __init__(self):
from backend.integrations.webhooks.github import GithubWebhookType
example_payload = json.loads(
self.EXAMPLE_PAYLOAD_FILE.read_text(encoding="utf-8")
)
super().__init__(
id="6c60ec01-8128-419e-988f-96a063ee2fea",
description="This block triggers on pull request events and outputs the event type and payload.",
categories={BlockCategory.DEVELOPER_TOOLS, BlockCategory.INPUT},
input_schema=GithubPullRequestTriggerBlock.Input,
output_schema=GithubPullRequestTriggerBlock.Output,
# --8<-- [start:example-webhook_config]
webhook_config=BlockWebhookConfig(
provider="github",
webhook_type=GithubWebhookType.REPO,
resource_format="{repo}",
event_filter_input="events",
event_format="pull_request.{event}",
),
# --8<-- [end:example-webhook_config]
test_input={
"repo": "Significant-Gravitas/AutoGPT",
"events": {"opened": True, "synchronize": True},
"credentials": TEST_CREDENTIALS_INPUT,
"payload": example_payload,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("payload", example_payload),
("triggered_by_user", example_payload["sender"]),
("event", example_payload["action"]),
("number", example_payload["number"]),
("pull_request", example_payload["pull_request"]),
("pull_request_url", example_payload["pull_request"]["html_url"]),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput: # type: ignore
yield from super().run(input_data, **kwargs)
yield "event", input_data.payload["action"]
yield "number", input_data.payload["number"]
yield "pull_request", input_data.payload["pull_request"]
yield "pull_request_url", input_data.payload["pull_request"]["html_url"]
# --8<-- [end:GithubTriggerExample]
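# A small sketch of how an events filter plus the "pull_request.{event}" format
# above expands into concrete webhook event names; the helper name and trimmed
# filter are hypothetical.
from pydantic import BaseModel

class _EventsFilter(BaseModel):
    # Trimmed to three flags for brevity; the real filter lists every PR action.
    opened: bool = False
    synchronize: bool = False
    closed: bool = False

def selected_events(events: _EventsFilter, event_format: str = "pull_request.{event}") -> list[str]:
    # Expand every enabled flag through the event_format template.
    return [
        event_format.format(event=name)
        for name, enabled in events.model_dump().items()
        if enabled
    ]

print(selected_events(_EventsFilter(opened=True, synchronize=True)))
# -> ['pull_request.opened', 'pull_request.synchronize']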

View File

@ -1,9 +1,9 @@
from typing import Literal
from autogpt_libs.supabase_integration_credentials_store.types import OAuth2Credentials
from pydantic import SecretStr
from backend.data.model import CredentialsField, CredentialsMetaInput
from backend.data.model import CredentialsField, CredentialsMetaInput, OAuth2Credentials
from backend.integrations.providers import ProviderName
from backend.util.settings import Secrets
# --8<-- [start:GoogleOAuthIsConfigured]
@ -13,7 +13,9 @@ GOOGLE_OAUTH_IS_CONFIGURED = bool(
)
# --8<-- [end:GoogleOAuthIsConfigured]
GoogleCredentials = OAuth2Credentials
GoogleCredentialsInput = CredentialsMetaInput[Literal["google"], Literal["oauth2"]]
GoogleCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.GOOGLE], Literal["oauth2"]
]
def GoogleCredentialsField(scopes: list[str]) -> GoogleCredentialsInput:
@ -24,8 +26,6 @@ def GoogleCredentialsField(scopes: list[str]) -> GoogleCredentialsInput:
scopes: The authorization scopes needed for the block to work.
"""
return CredentialsField(
provider="google",
supported_credential_types={"oauth2"},
required_scopes=set(scopes),
description="The Google integration requires OAuth2 authentication.",
)

View File

@ -1,11 +1,16 @@
from typing import Literal
import googlemaps
from autogpt_libs.supabase_integration_credentials_store.types import APIKeyCredentials
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import CredentialsField, CredentialsMetaInput, SchemaField
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
@ -34,12 +39,8 @@ class Place(BaseModel):
class GoogleMapsSearchBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal["google_maps"], Literal["api_key"]
] = CredentialsField(
provider="google_maps",
supported_credential_types={"api_key"},
description="Google Maps API Key",
)
Literal[ProviderName.GOOGLE_MAPS], Literal["api_key"]
] = CredentialsField(description="Google Maps API Key")
query: str = SchemaField(
description="Search query for local businesses",
placeholder="e.g., 'restaurants in New York'",

View File

@ -0,0 +1,14 @@
from typing import Any, Optional
from backend.util.request import requests
class GetRequest:
@classmethod
def get_request(
cls, url: str, headers: Optional[dict] = None, json: bool = False
) -> Any:
if headers is None:
headers = {}
response = requests.get(url, headers=headers)
return response.json() if json else response.text
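# A minimal usage sketch of the mixin above; the URLs are hypothetical. Blocks
# such as SearchTheWebBlock inherit GetRequest and call get_request, which also
# gives their tests a single method to mock.
html = GetRequest.get_request("https://example.com", headers={"Accept": "text/html"})
data = GetRequest.get_request("https://api.example.com/items", json=True)  # parsed JSON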

View File

@ -1,10 +1,10 @@
import json
from enum import Enum
import requests
from typing import Any
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class HttpMethod(Enum):
@ -31,9 +31,14 @@ class SendWebRequestBlock(Block):
description="The headers to include in the request",
default={},
)
body: object = SchemaField(
json_format: bool = SchemaField(
title="JSON format",
description="Whether to send and receive body as JSON",
default=True,
)
body: Any = SchemaField(
description="The body of the request",
default={},
default=None,
)
class Output(BlockSchema):
@ -51,20 +56,32 @@ class SendWebRequestBlock(Block):
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
if isinstance(input_data.body, str):
input_data.body = json.loads(input_data.body)
body = input_data.body
if input_data.json_format:
if isinstance(body, str):
try:
# Try to parse as JSON first
body = json.loads(body)
except json.JSONDecodeError:
# If it's not valid JSON and just plain text,
# we should send it as plain text instead
input_data.json_format = False
response = requests.request(
input_data.method.value,
input_data.url,
headers=input_data.headers,
json=input_data.body,
json=body if input_data.json_format else None,
data=body if not input_data.json_format else None,
)
result = response.json() if input_data.json_format else response.text
if response.status_code // 100 == 2:
yield "response", response.json()
yield "response", result
elif response.status_code // 100 == 4:
yield "client_error", response.json()
yield "client_error", result
elif response.status_code // 100 == 5:
yield "server_error", response.json()
yield "server_error", result
else:
raise ValueError(f"Unexpected status code: {response.status_code}")

View File

@ -0,0 +1,35 @@
from typing import Literal
from pydantic import SecretStr
from backend.data.model import APIKeyCredentials, CredentialsField, CredentialsMetaInput
from backend.integrations.providers import ProviderName
HubSpotCredentials = APIKeyCredentials
HubSpotCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.HUBSPOT],
Literal["api_key"],
]
def HubSpotCredentialsField() -> HubSpotCredentialsInput:
"""Creates a HubSpot credentials input on a block."""
return CredentialsField(
description="The HubSpot integration requires an API Key.",
)
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="hubspot",
api_key=SecretStr("mock-hubspot-api-key"),
title="Mock HubSpot API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}

View File

@ -0,0 +1,106 @@
from backend.blocks.hubspot._auth import (
HubSpotCredentials,
HubSpotCredentialsField,
HubSpotCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class HubSpotCompanyBlock(Block):
class Input(BlockSchema):
credentials: HubSpotCredentialsInput = HubSpotCredentialsField()
operation: str = SchemaField(
description="Operation to perform (create, update, get)", default="get"
)
company_data: dict = SchemaField(
description="Company data for create/update operations", default={}
)
domain: str = SchemaField(
description="Company domain for get/update operations", default=""
)
class Output(BlockSchema):
company: dict = SchemaField(description="Company information")
status: str = SchemaField(description="Operation status")
def __init__(self):
super().__init__(
id="3ae02219-d540-47cd-9c78-3ad6c7d9820a",
description="Manages HubSpot companies - create, update, and retrieve company information",
categories={BlockCategory.CRM},
input_schema=HubSpotCompanyBlock.Input,
output_schema=HubSpotCompanyBlock.Output,
)
def run(
self, input_data: Input, *, credentials: HubSpotCredentials, **kwargs
) -> BlockOutput:
base_url = "https://api.hubapi.com/crm/v3/objects/companies"
headers = {
"Authorization": f"Bearer {credentials.api_key.get_secret_value()}",
"Content-Type": "application/json",
}
if input_data.operation == "create":
response = requests.post(
base_url, headers=headers, json={"properties": input_data.company_data}
)
result = response.json()
yield "company", result
yield "status", "created"
elif input_data.operation == "get":
search_url = f"{base_url}/search"
search_data = {
"filterGroups": [
{
"filters": [
{
"propertyName": "domain",
"operator": "EQ",
"value": input_data.domain,
}
]
}
]
}
response = requests.post(search_url, headers=headers, json=search_data)
result = response.json()
yield "company", result.get("results", [{}])[0]
yield "status", "retrieved"
elif input_data.operation == "update":
# First get company ID by domain
search_response = requests.post(
f"{base_url}/search",
headers=headers,
json={
"filterGroups": [
{
"filters": [
{
"propertyName": "domain",
"operator": "EQ",
"value": input_data.domain,
}
]
}
]
},
)
company_id = search_response.json().get("results", [{}])[0].get("id")
if company_id:
response = requests.patch(
f"{base_url}/{company_id}",
headers=headers,
json={"properties": input_data.company_data},
)
result = response.json()
yield "company", result
yield "status", "updated"
else:
yield "company", {}
yield "status", "company_not_found"

View File

@ -0,0 +1,106 @@
from backend.blocks.hubspot._auth import (
HubSpotCredentials,
HubSpotCredentialsField,
HubSpotCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class HubSpotContactBlock(Block):
class Input(BlockSchema):
credentials: HubSpotCredentialsInput = HubSpotCredentialsField()
operation: str = SchemaField(
description="Operation to perform (create, update, get)", default="get"
)
contact_data: dict = SchemaField(
description="Contact data for create/update operations", default={}
)
email: str = SchemaField(
description="Email address for get/update operations", default=""
)
class Output(BlockSchema):
contact: dict = SchemaField(description="Contact information")
status: str = SchemaField(description="Operation status")
def __init__(self):
super().__init__(
id="5267326e-c4c1-4016-9f54-4e72ad02f813",
description="Manages HubSpot contacts - create, update, and retrieve contact information",
categories={BlockCategory.CRM},
input_schema=HubSpotContactBlock.Input,
output_schema=HubSpotContactBlock.Output,
)
def run(
self, input_data: Input, *, credentials: HubSpotCredentials, **kwargs
) -> BlockOutput:
base_url = "https://api.hubapi.com/crm/v3/objects/contacts"
headers = {
"Authorization": f"Bearer {credentials.api_key.get_secret_value()}",
"Content-Type": "application/json",
}
if input_data.operation == "create":
response = requests.post(
base_url, headers=headers, json={"properties": input_data.contact_data}
)
result = response.json()
yield "contact", result
yield "status", "created"
elif input_data.operation == "get":
# Search for contact by email
search_url = f"{base_url}/search"
search_data = {
"filterGroups": [
{
"filters": [
{
"propertyName": "email",
"operator": "EQ",
"value": input_data.email,
}
]
}
]
}
response = requests.post(search_url, headers=headers, json=search_data)
result = response.json()
yield "contact", result.get("results", [{}])[0]
yield "status", "retrieved"
elif input_data.operation == "update":
search_response = requests.post(
f"{base_url}/search",
headers=headers,
json={
"filterGroups": [
{
"filters": [
{
"propertyName": "email",
"operator": "EQ",
"value": input_data.email,
}
]
}
]
},
)
contact_id = search_response.json().get("results", [{}])[0].get("id")
if contact_id:
response = requests.patch(
f"{base_url}/{contact_id}",
headers=headers,
json={"properties": input_data.contact_data},
)
result = response.json()
yield "contact", result
yield "status", "updated"
else:
yield "contact", {}
yield "status", "contact_not_found"

View File

@ -0,0 +1,121 @@
from datetime import datetime, timedelta
from backend.blocks.hubspot._auth import (
HubSpotCredentials,
HubSpotCredentialsField,
HubSpotCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class HubSpotEngagementBlock(Block):
class Input(BlockSchema):
credentials: HubSpotCredentialsInput = HubSpotCredentialsField()
operation: str = SchemaField(
description="Operation to perform (send_email, track_engagement)",
default="send_email",
)
email_data: dict = SchemaField(
description="Email data including recipient, subject, content",
default={},
)
contact_id: str = SchemaField(
description="Contact ID for engagement tracking", default=""
)
timeframe_days: int = SchemaField(
description="Number of days to look back for engagement",
default=30,
optional=True,
)
class Output(BlockSchema):
result: dict = SchemaField(description="Operation result")
status: str = SchemaField(description="Operation status")
def __init__(self):
super().__init__(
id="c6524385-7d87-49d6-a470-248bd29ca765",
description="Manages HubSpot engagements - sends emails and tracks engagement metrics",
categories={BlockCategory.CRM, BlockCategory.COMMUNICATION},
input_schema=HubSpotEngagementBlock.Input,
output_schema=HubSpotEngagementBlock.Output,
)
def run(
self, input_data: Input, *, credentials: HubSpotCredentials, **kwargs
) -> BlockOutput:
base_url = "https://api.hubapi.com"
headers = {
"Authorization": f"Bearer {credentials.api_key.get_secret_value()}",
"Content-Type": "application/json",
}
if input_data.operation == "send_email":
# Using the email send API
email_url = f"{base_url}/crm/v3/objects/emails"
email_data = {
"properties": {
"hs_timestamp": datetime.now().isoformat(),
"hubspot_owner_id": "1", # This should be configurable
"hs_email_direction": "OUTBOUND",
"hs_email_status": "SEND",
"hs_email_subject": input_data.email_data.get("subject"),
"hs_email_text": input_data.email_data.get("content"),
"hs_email_to_email": input_data.email_data.get("recipient"),
}
}
response = requests.post(email_url, headers=headers, json=email_data)
result = response.json()
yield "result", result
yield "status", "email_sent"
elif input_data.operation == "track_engagement":
# Get engagement events for the contact
from_date = datetime.now() - timedelta(days=input_data.timeframe_days)
engagement_url = (
f"{base_url}/crm/v3/objects/contacts/{input_data.contact_id}/engagement"
)
params = {"limit": 100, "after": from_date.isoformat()}
response = requests.get(engagement_url, headers=headers, params=params)
engagements = response.json()
# Process engagement metrics
metrics = {
"email_opens": 0,
"email_clicks": 0,
"email_replies": 0,
"last_engagement": None,
"engagement_score": 0,
}
for engagement in engagements.get("results", []):
eng_type = engagement.get("properties", {}).get("hs_engagement_type")
if eng_type == "EMAIL":
metrics["email_opens"] += 1
elif eng_type == "EMAIL_CLICK":
metrics["email_clicks"] += 1
elif eng_type == "EMAIL_REPLY":
metrics["email_replies"] += 1
# Update last engagement time
eng_time = engagement.get("properties", {}).get("hs_timestamp")
if eng_time and (
not metrics["last_engagement"]
or eng_time > metrics["last_engagement"]
):
metrics["last_engagement"] = eng_time
# Calculate simple engagement score
metrics["engagement_score"] = (
metrics["email_opens"]
+ metrics["email_clicks"] * 2
+ metrics["email_replies"] * 3
)
yield "result", metrics
yield "status", "engagement_tracked"

View File

@ -1,12 +1,18 @@
from enum import Enum
from typing import Any, Dict, Literal, Optional
import requests
from autogpt_libs.supabase_integration_credentials_store.types import APIKeyCredentials
from pydantic import SecretStr
from requests.exceptions import RequestException
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import CredentialsField, CredentialsMetaInput, SchemaField
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
from backend.util.request import requests
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
@ -78,13 +84,10 @@ class UpscaleOption(str, Enum):
class IdeogramModelBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[Literal["ideogram"], Literal["api_key"]] = (
CredentialsField(
provider="ideogram",
supported_credential_types={"api_key"},
description="The Ideogram integration can be used with any API key with sufficient permissions for the blocks it is used on.",
)
credentials: CredentialsMetaInput[
Literal[ProviderName.IDEOGRAM], Literal["api_key"]
] = CredentialsField(
description="The Ideogram integration can be used with any API key with sufficient permissions for the blocks it is used on.",
)
prompt: str = SchemaField(
description="Text prompt for image generation",
@ -242,9 +245,8 @@ class IdeogramModelBlock(Block):
try:
response = requests.post(url, json=data, headers=headers)
response.raise_for_status()
return response.json()["data"][0]["url"]
except requests.exceptions.RequestException as e:
except RequestException as e:
raise Exception(f"Failed to fetch image: {str(e)}")
def upscale_image(self, api_key: SecretStr, image_url: str):
@ -256,7 +258,6 @@ class IdeogramModelBlock(Block):
try:
# Step 1: Download the image from the provided URL
image_response = requests.get(image_url)
image_response.raise_for_status()
# Step 2: Send the downloaded image to the upscale API
files = {
@ -272,8 +273,7 @@ class IdeogramModelBlock(Block):
files=files,
)
response.raise_for_status()
return response.json()["data"][0]["url"]
except requests.exceptions.RequestException as e:
except RequestException as e:
raise Exception(f"Failed to upscale image: {str(e)}")

View File

@ -2,13 +2,28 @@ from typing import Any
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.json import json
class StepThroughItemsBlock(Block):
class Input(BlockSchema):
items: list | dict = SchemaField(
items: list = SchemaField(
advanced=False,
description="The list or dictionary of items to iterate over",
placeholder="[1, 2, 3, 4, 5] or {'key1': 'value1', 'key2': 'value2'}",
default=[],
)
items_object: dict = SchemaField(
advanced=False,
description="The list or dictionary of items to iterate over",
placeholder="[1, 2, 3, 4, 5] or {'key1': 'value1', 'key2': 'value2'}",
default={},
)
items_str: str = SchemaField(
advanced=False,
description="The list or dictionary of items to iterate over",
placeholder="[1, 2, 3, 4, 5] or {'key1': 'value1', 'key2': 'value2'}",
default="",
)
class Output(BlockSchema):
@ -39,14 +54,20 @@ class StepThroughItemsBlock(Block):
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
items = input_data.items
if isinstance(items, dict):
# If items is a dictionary, iterate over its values
for item in items.values():
yield "item", item
yield "key", item
else:
# If items is a list, iterate over the list
for index, item in enumerate(items):
yield "item", item
yield "key", index
for data in [input_data.items, input_data.items_object, input_data.items_str]:
if not data:
continue
if isinstance(data, str):
items = json.loads(data)
else:
items = data
if isinstance(items, dict):
# If items is a dictionary, iterate over its values
for item in items.values():
yield "item", item
yield "key", item
else:
# If items is a list, iterate over the list
for index, item in enumerate(items):
yield "item", item
yield "key", index

View File

@ -1,13 +1,13 @@
from typing import Literal
from autogpt_libs.supabase_integration_credentials_store.types import APIKeyCredentials
from pydantic import SecretStr
from backend.data.model import CredentialsField, CredentialsMetaInput
from backend.data.model import APIKeyCredentials, CredentialsField, CredentialsMetaInput
from backend.integrations.providers import ProviderName
JinaCredentials = APIKeyCredentials
JinaCredentialsInput = CredentialsMetaInput[
Literal["jina"],
Literal[ProviderName.JINA],
Literal["api_key"],
]
@ -18,8 +18,6 @@ def JinaCredentialsField() -> JinaCredentialsInput:
"""
return CredentialsField(
provider="jina",
supported_credential_types={"api_key"},
description="The Jina integration can be used with an API Key.",
)

View File

@ -1,5 +1,3 @@
import requests
from backend.blocks.jina._auth import (
JinaCredentials,
JinaCredentialsField,
@ -7,6 +5,7 @@ from backend.blocks.jina._auth import (
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class JinaChunkingBlock(Block):
@ -57,7 +56,6 @@ class JinaChunkingBlock(Block):
}
response = requests.post(url, headers=headers, json=data)
response.raise_for_status()
result = response.json()
all_chunks.extend(result.get("chunks", []))

View File

@ -1,5 +1,3 @@
import requests
from backend.blocks.jina._auth import (
JinaCredentials,
JinaCredentialsField,
@ -7,6 +5,7 @@ from backend.blocks.jina._auth import (
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class JinaEmbeddingBlock(Block):

View File

@ -0,0 +1,59 @@
from urllib.parse import quote
import requests
from backend.blocks.jina._auth import (
JinaCredentials,
JinaCredentialsField,
JinaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class FactCheckerBlock(Block):
class Input(BlockSchema):
statement: str = SchemaField(
description="The statement to check for factuality"
)
credentials: JinaCredentialsInput = JinaCredentialsField()
class Output(BlockSchema):
factuality: float = SchemaField(
description="The factuality score of the statement"
)
result: bool = SchemaField(description="The result of the factuality check")
reason: str = SchemaField(description="The reason for the factuality result")
error: str = SchemaField(description="Error message if the check fails")
def __init__(self):
super().__init__(
id="d38b6c5e-9968-4271-8423-6cfe60d6e7e6",
description="This block checks the factuality of a given statement using Jina AI's Grounding API.",
categories={BlockCategory.SEARCH},
input_schema=FactCheckerBlock.Input,
output_schema=FactCheckerBlock.Output,
)
def run(
self, input_data: Input, *, credentials: JinaCredentials, **kwargs
) -> BlockOutput:
encoded_statement = quote(input_data.statement)
url = f"https://g.jina.ai/{encoded_statement}"
headers = {
"Accept": "application/json",
"Authorization": f"Bearer {credentials.api_key.get_secret_value()}",
}
response = requests.get(url, headers=headers)
response.raise_for_status()
data = response.json()
if "data" in data:
data = data["data"]
yield "factuality", data["factuality"]
yield "result", data["result"]
yield "reason", data["reason"]
else:
raise RuntimeError(f"Expected 'data' key not found in response: {data}")

View File

@ -0,0 +1,107 @@
from urllib.parse import quote
from backend.blocks.jina._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
JinaCredentials,
JinaCredentialsField,
JinaCredentialsInput,
)
from backend.blocks.search import GetRequest
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class SearchTheWebBlock(Block, GetRequest):
class Input(BlockSchema):
credentials: JinaCredentialsInput = JinaCredentialsField()
query: str = SchemaField(description="The search query to search the web for")
class Output(BlockSchema):
results: str = SchemaField(
description="The search results including content from top 5 URLs"
)
error: str = SchemaField(description="Error message if the search fails")
def __init__(self):
super().__init__(
id="87840993-2053-44b7-8da4-187ad4ee518c",
description="This block searches the internet for the given search query.",
categories={BlockCategory.SEARCH},
input_schema=SearchTheWebBlock.Input,
output_schema=SearchTheWebBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"query": "Artificial Intelligence",
},
test_credentials=TEST_CREDENTIALS,
test_output=("results", "search content"),
test_mock={"get_request": lambda *args, **kwargs: "search content"},
)
def run(
self, input_data: Input, *, credentials: JinaCredentials, **kwargs
) -> BlockOutput:
# Encode the search query
encoded_query = quote(input_data.query)
headers = {
"Content-Type": "application/json",
"Authorization": f"Bearer {credentials.api_key.get_secret_value()}",
}
# Prepend the Jina Search URL to the encoded query
jina_search_url = f"https://s.jina.ai/{encoded_query}"
results = self.get_request(jina_search_url, headers=headers, json=False)
# Output the search results
yield "results", results
class ExtractWebsiteContentBlock(Block, GetRequest):
class Input(BlockSchema):
credentials: JinaCredentialsInput = JinaCredentialsField()
url: str = SchemaField(description="The URL to scrape the content from")
raw_content: bool = SchemaField(
default=False,
title="Raw Content",
description="Whether to do a raw scrape of the content or use Jina-ai Reader to scrape the content",
advanced=True,
)
class Output(BlockSchema):
content: str = SchemaField(description="The scraped content from the given URL")
error: str = SchemaField(
description="Error message if the content cannot be retrieved"
)
def __init__(self):
super().__init__(
id="436c3984-57fd-4b85-8e9a-459b356883bd",
description="This block scrapes the content from the given web URL.",
categories={BlockCategory.SEARCH},
input_schema=ExtractWebsiteContentBlock.Input,
output_schema=ExtractWebsiteContentBlock.Output,
test_input={
"url": "https://en.wikipedia.org/wiki/Artificial_intelligence",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=("content", "scraped content"),
test_mock={"get_request": lambda *args, **kwargs: "scraped content"},
)
def run(
self, input_data: Input, *, credentials: JinaCredentials, **kwargs
) -> BlockOutput:
if input_data.raw_content:
url = input_data.url
headers = {}
else:
url = f"https://r.jina.ai/{input_data.url}"
headers = {
"Content-Type": "application/json",
"Authorization": f"Bearer {credentials.api_key.get_secret_value()}",
}
content = self.get_request(url, json=False, headers=headers)
yield "content", content

View File

@ -5,9 +5,10 @@ from json import JSONDecodeError
from types import MappingProxyType
from typing import TYPE_CHECKING, Any, List, Literal, NamedTuple
from autogpt_libs.supabase_integration_credentials_store.types import APIKeyCredentials
from pydantic import SecretStr
from backend.integrations.providers import ProviderName
if TYPE_CHECKING:
from enum import _EnumMemberT
@ -17,24 +18,31 @@ import openai
from groq import Groq
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import CredentialsField, CredentialsMetaInput, SchemaField
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.util import json
from backend.util.settings import BehaveAs, Settings
from backend.util.text import TextFormatter
logger = logging.getLogger(__name__)
fmt = TextFormatter()
# LlmApiKeys = {
# "openai": BlockSecret("openai_api_key"),
# "anthropic": BlockSecret("anthropic_api_key"),
# "groq": BlockSecret("groq_api_key"),
# "ollama": BlockSecret(value=""),
# }
AICredentials = CredentialsMetaInput[Literal["llm"], Literal["api_key"]]
LLMProviderName = Literal[
ProviderName.ANTHROPIC,
ProviderName.GROQ,
ProviderName.OLLAMA,
ProviderName.OPENAI,
ProviderName.OPEN_ROUTER,
]
AICredentials = CredentialsMetaInput[LLMProviderName, Literal["api_key"]]
TEST_CREDENTIALS = APIKeyCredentials(
id="ed55ac19-356e-4243-a6cb-bc599e9b716f",
provider="llm",
provider="openai",
api_key=SecretStr("mock-openai-api-key"),
title="Mock OpenAI API key",
expires_at=None,
@ -50,15 +58,16 @@ TEST_CREDENTIALS_INPUT = {
def AICredentialsField() -> AICredentials:
return CredentialsField(
description="API key for the LLM provider.",
provider="llm",
supported_credential_types={"api_key"},
discriminator="model",
discriminator_mapping={
model.value: model.metadata.provider for model in LlmModel
},
)
class ModelMetadata(NamedTuple):
provider: str
context_window: int
cost_factor: int
class LlmModelMeta(EnumMeta):
@ -102,8 +111,29 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
LLAMA3_1_70B = "llama-3.1-70b-versatile"
LLAMA3_1_8B = "llama-3.1-8b-instant"
# Ollama models
OLLAMA_LLAMA3_2 = "llama3.2"
OLLAMA_LLAMA3_8B = "llama3"
OLLAMA_LLAMA3_405B = "llama3.1:405b"
OLLAMA_DOLPHIN = "dolphin-mistral:latest"
# OpenRouter models
GEMINI_FLASH_1_5_8B = "google/gemini-flash-1.5"
GROK_BETA = "x-ai/grok-beta"
MISTRAL_NEMO = "mistralai/mistral-nemo"
COHERE_COMMAND_R_08_2024 = "cohere/command-r-08-2024"
COHERE_COMMAND_R_PLUS_08_2024 = "cohere/command-r-plus-08-2024"
EVA_QWEN_2_5_32B = "eva-unit-01/eva-qwen-2.5-32b"
DEEPSEEK_CHAT = "deepseek/deepseek-chat"
PERPLEXITY_LLAMA_3_1_SONAR_LARGE_128K_ONLINE = (
"perplexity/llama-3.1-sonar-large-128k-online"
)
QWEN_QWQ_32B_PREVIEW = "qwen/qwq-32b-preview"
NOUSRESEARCH_HERMES_3_LLAMA_3_1_405B = "nousresearch/hermes-3-llama-3.1-405b"
NOUSRESEARCH_HERMES_3_LLAMA_3_1_70B = "nousresearch/hermes-3-llama-3.1-70b"
AMAZON_NOVA_LITE_V1 = "amazon/nova-lite-v1"
AMAZON_NOVA_MICRO_V1 = "amazon/nova-micro-v1"
AMAZON_NOVA_PRO_V1 = "amazon/nova-pro-v1"
MICROSOFT_WIZARDLM_2_8X22B = "microsoft/wizardlm-2-8x22b"
GRYPHE_MYTHOMAX_L2_13B = "gryphe/mythomax-l2-13b"
@property
def metadata(self) -> ModelMetadata:
@ -117,31 +147,47 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
def context_window(self) -> int:
return self.metadata.context_window
@property
def cost_factor(self) -> int:
return self.metadata.cost_factor
MODEL_METADATA = {
LlmModel.O1_PREVIEW: ModelMetadata("openai", 32000, cost_factor=16),
LlmModel.O1_MINI: ModelMetadata("openai", 62000, cost_factor=4),
LlmModel.GPT4O_MINI: ModelMetadata("openai", 128000, cost_factor=1),
LlmModel.GPT4O: ModelMetadata("openai", 128000, cost_factor=3),
LlmModel.GPT4_TURBO: ModelMetadata("openai", 128000, cost_factor=10),
LlmModel.GPT3_5_TURBO: ModelMetadata("openai", 16385, cost_factor=1),
LlmModel.CLAUDE_3_5_SONNET: ModelMetadata("anthropic", 200000, cost_factor=4),
LlmModel.CLAUDE_3_HAIKU: ModelMetadata("anthropic", 200000, cost_factor=1),
LlmModel.LLAMA3_8B: ModelMetadata("groq", 8192, cost_factor=1),
LlmModel.LLAMA3_70B: ModelMetadata("groq", 8192, cost_factor=1),
LlmModel.MIXTRAL_8X7B: ModelMetadata("groq", 32768, cost_factor=1),
LlmModel.GEMMA_7B: ModelMetadata("groq", 8192, cost_factor=1),
LlmModel.GEMMA2_9B: ModelMetadata("groq", 8192, cost_factor=1),
LlmModel.LLAMA3_1_405B: ModelMetadata("groq", 8192, cost_factor=1),
LlmModel.O1_PREVIEW: ModelMetadata("openai", 32000),
LlmModel.O1_MINI: ModelMetadata("openai", 62000),
LlmModel.GPT4O_MINI: ModelMetadata("openai", 128000),
LlmModel.GPT4O: ModelMetadata("openai", 128000),
LlmModel.GPT4_TURBO: ModelMetadata("openai", 128000),
LlmModel.GPT3_5_TURBO: ModelMetadata("openai", 16385),
LlmModel.CLAUDE_3_5_SONNET: ModelMetadata("anthropic", 200000),
LlmModel.CLAUDE_3_HAIKU: ModelMetadata("anthropic", 200000),
LlmModel.LLAMA3_8B: ModelMetadata("groq", 8192),
LlmModel.LLAMA3_70B: ModelMetadata("groq", 8192),
LlmModel.MIXTRAL_8X7B: ModelMetadata("groq", 32768),
LlmModel.GEMMA_7B: ModelMetadata("groq", 8192),
LlmModel.GEMMA2_9B: ModelMetadata("groq", 8192),
LlmModel.LLAMA3_1_405B: ModelMetadata("groq", 8192),
# Limited to 16k during preview
LlmModel.LLAMA3_1_70B: ModelMetadata("groq", 131072, cost_factor=1),
LlmModel.LLAMA3_1_8B: ModelMetadata("groq", 131072, cost_factor=1),
LlmModel.OLLAMA_LLAMA3_8B: ModelMetadata("ollama", 8192, cost_factor=1),
LlmModel.OLLAMA_LLAMA3_405B: ModelMetadata("ollama", 8192, cost_factor=1),
LlmModel.LLAMA3_1_70B: ModelMetadata("groq", 131072),
LlmModel.LLAMA3_1_8B: ModelMetadata("groq", 131072),
LlmModel.OLLAMA_LLAMA3_2: ModelMetadata("ollama", 8192),
LlmModel.OLLAMA_LLAMA3_8B: ModelMetadata("ollama", 8192),
LlmModel.OLLAMA_LLAMA3_405B: ModelMetadata("ollama", 8192),
LlmModel.OLLAMA_DOLPHIN: ModelMetadata("ollama", 32768),
LlmModel.GEMINI_FLASH_1_5_8B: ModelMetadata("open_router", 8192),
LlmModel.GROK_BETA: ModelMetadata("open_router", 8192),
LlmModel.MISTRAL_NEMO: ModelMetadata("open_router", 4000),
LlmModel.COHERE_COMMAND_R_08_2024: ModelMetadata("open_router", 4000),
LlmModel.COHERE_COMMAND_R_PLUS_08_2024: ModelMetadata("open_router", 4000),
LlmModel.EVA_QWEN_2_5_32B: ModelMetadata("open_router", 4000),
LlmModel.DEEPSEEK_CHAT: ModelMetadata("open_router", 8192),
LlmModel.PERPLEXITY_LLAMA_3_1_SONAR_LARGE_128K_ONLINE: ModelMetadata(
"open_router", 8192
),
LlmModel.QWEN_QWQ_32B_PREVIEW: ModelMetadata("open_router", 4000),
LlmModel.NOUSRESEARCH_HERMES_3_LLAMA_3_1_405B: ModelMetadata("open_router", 4000),
LlmModel.NOUSRESEARCH_HERMES_3_LLAMA_3_1_70B: ModelMetadata("open_router", 4000),
LlmModel.AMAZON_NOVA_LITE_V1: ModelMetadata("open_router", 4000),
LlmModel.AMAZON_NOVA_MICRO_V1: ModelMetadata("open_router", 4000),
LlmModel.AMAZON_NOVA_PRO_V1: ModelMetadata("open_router", 4000),
LlmModel.MICROSOFT_WIZARDLM_2_8X22B: ModelMetadata("open_router", 4000),
LlmModel.GRYPHE_MYTHOMAX_L2_13B: ModelMetadata("open_router", 4000),
}
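# A minimal sketch of the metadata lookup pattern above, now that each entry
# carries only the provider and context window; the two-entry enum and its
# values are a hypothetical stand-in for LlmModel.
from enum import Enum
from typing import NamedTuple

class _ModelMetadata(NamedTuple):
    provider: str
    context_window: int

class _MiniLlmModel(str, Enum):
    GPT4O = "gpt-4o"
    OLLAMA_LLAMA3_2 = "llama3.2"

    @property
    def metadata(self) -> _ModelMetadata:
        return _MINI_METADATA[self]

    @property
    def context_window(self) -> int:
        return self.metadata.context_window

_MINI_METADATA = {
    _MiniLlmModel.GPT4O: _ModelMetadata("openai", 128000),
    _MiniLlmModel.OLLAMA_LLAMA3_2: _ModelMetadata("ollama", 8192),
}

print(_MiniLlmModel.OLLAMA_LLAMA3_2.metadata.provider)  # "ollama"
print(_MiniLlmModel.GPT4O.context_window)               # 128000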
for model in LlmModel:
@ -192,7 +238,9 @@ class AIStructuredResponseGeneratorBlock(Block):
description="Number of times to retry the LLM call if the response does not match the expected format.",
)
prompt_values: dict[str, str] = SchemaField(
advanced=False, default={}, description="Values used to fill in the prompt."
advanced=False,
default={},
description="Values used to fill in the prompt. The values can be used in the prompt by putting them in a double curly braces, e.g. {{variable_name}}.",
)
max_tokens: int | None = SchemaField(
advanced=True,
@ -200,6 +248,12 @@ class AIStructuredResponseGeneratorBlock(Block):
description="The maximum number of tokens to generate in the chat completion.",
)
ollama_host: str = SchemaField(
advanced=True,
default="localhost:11434",
description="Ollama host for local models",
)
class Output(BlockSchema):
response: dict[str, Any] = SchemaField(
description="The response object generated by the language model."
@ -245,6 +299,7 @@ class AIStructuredResponseGeneratorBlock(Block):
prompt: list[dict],
json_format: bool,
max_tokens: int | None = None,
ollama_host: str = "localhost:11434",
) -> tuple[str, int, int]:
"""
Args:
@ -253,6 +308,7 @@ class AIStructuredResponseGeneratorBlock(Block):
prompt: The prompt to send to the LLM.
json_format: Whether the response should be in JSON format.
max_tokens: The maximum number of tokens to generate in the chat completion.
ollama_host: The Ollama host to use for local models
Returns:
The response from the LLM.
@ -311,8 +367,15 @@ class AIStructuredResponseGeneratorBlock(Block):
max_tokens=max_tokens or 8192,
)
if not resp.content:
raise ValueError("No content returned from Anthropic.")
return (
resp.content[0].text if resp.content else "",
(
resp.content[0].name
if isinstance(resp.content[0], anthropic.types.ToolUseBlock)
else resp.content[0].text
),
resp.usage.input_tokens,
resp.usage.output_tokens,
)
@ -335,9 +398,10 @@ class AIStructuredResponseGeneratorBlock(Block):
response.usage.completion_tokens if response.usage else 0,
)
elif provider == "ollama":
client = ollama.Client(host=ollama_host)
sys_messages = [p["content"] for p in prompt if p["role"] == "system"]
usr_messages = [p["content"] for p in prompt if p["role"] != "system"]
response = ollama.generate(
response = client.generate(
model=llm_model.value,
prompt=f"{sys_messages}\n\n{usr_messages}",
stream=False,
@ -347,6 +411,34 @@ class AIStructuredResponseGeneratorBlock(Block):
response.get("prompt_eval_count") or 0,
response.get("eval_count") or 0,
)
elif provider == "open_router":
client = openai.OpenAI(
base_url="https://openrouter.ai/api/v1",
api_key=credentials.api_key.get_secret_value(),
)
response = client.chat.completions.create(
extra_headers={
"HTTP-Referer": "https://agpt.co",
"X-Title": "AutoGPT",
},
model=llm_model.value,
messages=prompt, # type: ignore
max_tokens=max_tokens,
)
# If there's no response, raise an error
if not response.choices:
if response:
raise ValueError(f"OpenRouter error: {response}")
else:
raise ValueError("No response from OpenRouter.")
return (
response.choices[0].message.content or "",
response.usage.prompt_tokens if response.usage else 0,
response.usage.completion_tokens if response.usage else 0,
)
else:
raise ValueError(f"Unsupported LLM provider: {provider}")
@ -362,8 +454,8 @@ class AIStructuredResponseGeneratorBlock(Block):
values = input_data.prompt_values
if values:
input_data.prompt = input_data.prompt.format(**values)
input_data.sys_prompt = input_data.sys_prompt.format(**values)
input_data.prompt = fmt.format_string(input_data.prompt, values)
input_data.sys_prompt = fmt.format_string(input_data.sys_prompt, values)
if input_data.sys_prompt:
prompt.append({"role": "system", "content": input_data.sys_prompt})
@ -409,6 +501,7 @@ class AIStructuredResponseGeneratorBlock(Block):
llm_model=llm_model,
prompt=prompt,
json_format=bool(input_data.expected_format),
ollama_host=input_data.ollama_host,
max_tokens=input_data.max_tokens,
)
self.merge_stats(
@ -468,7 +561,7 @@ class AIStructuredResponseGeneratorBlock(Block):
class AITextGeneratorBlock(Block):
class Input(BlockSchema):
prompt: str = SchemaField(
description="The prompt to send to the language model.",
description="The prompt to send to the language model. You can use any of the {keys} from Prompt Values to fill in the prompt with values from the prompt values dictionary by putting them in curly braces.",
placeholder="Enter your prompt here...",
)
model: LlmModel = SchemaField(
@ -489,7 +582,14 @@ class AITextGeneratorBlock(Block):
description="Number of times to retry the LLM call if the response does not match the expected format.",
)
prompt_values: dict[str, str] = SchemaField(
advanced=False, default={}, description="Values used to fill in the prompt."
advanced=False,
default={},
description="Values used to fill in the prompt. The values can be used in the prompt by putting them in a double curly braces, e.g. {{variable_name}}.",
)
ollama_host: str = SchemaField(
advanced=True,
default="localhost:11434",
description="Ollama host for local models",
)
max_tokens: int | None = SchemaField(
advanced=True,
@ -581,6 +681,11 @@ class AITextSummarizerBlock(Block):
description="The number of overlapping tokens between chunks to maintain context.",
ge=0,
)
ollama_host: str = SchemaField(
advanced=True,
default="localhost:11434",
description="Ollama host for local models",
)
class Output(BlockSchema):
summary: str = SchemaField(description="The final summary of the text.")
@ -719,6 +824,11 @@ class AIConversationBlock(Block):
default=None,
description="The maximum number of tokens to generate in the chat completion.",
)
ollama_host: str = SchemaField(
advanced=True,
default="localhost:11434",
description="Ollama host for local models",
)
class Output(BlockSchema):
response: str = SchemaField(
@ -816,6 +926,11 @@ class AIListGeneratorBlock(Block):
default=None,
description="The maximum number of tokens to generate in the chat completion.",
)
ollama_host: str = SchemaField(
advanced=True,
default="localhost:11434",
description="Ollama host for local models",
)
class Output(BlockSchema):
generated_list: List[str] = SchemaField(description="The generated list.")
@ -967,6 +1082,7 @@ class AIListGeneratorBlock(Block):
credentials=input_data.credentials,
model=input_data.model,
expected_format={}, # Do not use structured response
ollama_host=input_data.ollama_host,
),
credentials=credentials,
)


@ -1,18 +1,19 @@
from enum import Enum
from typing import List, Literal
import requests
from autogpt_libs.supabase_integration_credentials_store.types import APIKeyCredentials
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
BlockSecret,
CredentialsField,
CredentialsMetaInput,
SchemaField,
SecretField,
)
from backend.integrations.providers import ProviderName
from backend.util.request import requests
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
@ -77,12 +78,10 @@ class PublishToMediumBlock(Block):
description="Whether to notify followers that the user has published",
placeholder="False",
)
credentials: CredentialsMetaInput[Literal["medium"], Literal["api_key"]] = (
CredentialsField(
provider="medium",
supported_credential_types={"api_key"},
description="The Medium integration can be used with any API key with sufficient permissions for the blocks it is used on.",
)
credentials: CredentialsMetaInput[
Literal[ProviderName.MEDIUM], Literal["api_key"]
] = CredentialsField(
description="The Medium integration can be used with any API key with sufficient permissions for the blocks it is used on.",
)
class Output(BlockSchema):


@ -0,0 +1,32 @@
from typing import Literal
from pydantic import SecretStr
from backend.data.model import APIKeyCredentials, CredentialsField, CredentialsMetaInput
from backend.integrations.providers import ProviderName
NvidiaCredentials = APIKeyCredentials
NvidiaCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.NVIDIA],
Literal["api_key"],
]
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="nvidia",
api_key=SecretStr("mock-nvidia-api-key"),
title="Mock Nvidia API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
def NvidiaCredentialsField() -> NvidiaCredentialsInput:
"""Creates an Nvidia credentials input on a block."""
return CredentialsField(description="The Nvidia integration requires an API Key.")


@ -0,0 +1,90 @@
from backend.blocks.nvidia._auth import (
NvidiaCredentials,
NvidiaCredentialsField,
NvidiaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
class NvidiaDeepfakeDetectBlock(Block):
class Input(BlockSchema):
credentials: NvidiaCredentialsInput = NvidiaCredentialsField()
image_base64: str = SchemaField(
description="Image to analyze for deepfakes", image_upload=True
)
return_image: bool = SchemaField(
description="Whether to return the processed image with markings",
default=False,
)
class Output(BlockSchema):
status: str = SchemaField(
description="Detection status (SUCCESS, ERROR, CONTENT_FILTERED)",
default="",
)
image: str = SchemaField(
description="Processed image with detection markings (if return_image=True)",
default="",
image_output=True,
)
is_deepfake: float = SchemaField(
description="Probability that the image is a deepfake (0-1)",
default=0.0,
)
def __init__(self):
super().__init__(
id="8c7d0d67-e79c-44f6-92a1-c2600c8aac7f",
description="Detects potential deepfakes in images using Nvidia's AI API",
categories={BlockCategory.SAFETY},
input_schema=NvidiaDeepfakeDetectBlock.Input,
output_schema=NvidiaDeepfakeDetectBlock.Output,
)
def run(
self, input_data: Input, *, credentials: NvidiaCredentials, **kwargs
) -> BlockOutput:
url = "https://ai.api.nvidia.com/v1/cv/hive/deepfake-image-detection"
headers = {
"accept": "application/json",
"content-type": "application/json",
"Authorization": f"Bearer {credentials.api_key.get_secret_value()}",
}
image_data = f"data:image/jpeg;base64,{input_data.image_base64}"
payload = {
"input": [image_data],
"return_image": input_data.return_image,
}
try:
response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
data = response.json()
result = data.get("data", [{}])[0]
# Get deepfake probability from first bounding box if any
deepfake_prob = 0.0
if result.get("bounding_boxes"):
deepfake_prob = result["bounding_boxes"][0].get("is_deepfake", 0.0)
yield "status", result.get("status", "ERROR")
yield "is_deepfake", deepfake_prob
if input_data.return_image:
image_data = result.get("image", "")
output_data = f"data:image/jpeg;base64,{image_data}"
yield "image", output_data
else:
yield "image", ""
except Exception as e:
yield "error", str(e)
yield "status", "ERROR"
yield "is_deepfake", 0.0
yield "image", ""


@ -1,26 +1,27 @@
from typing import Literal
import uuid
from typing import Any, Literal
from autogpt_libs.supabase_integration_credentials_store import APIKeyCredentials
from pinecone import Pinecone, ServerlessSpec
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import CredentialsField, CredentialsMetaInput, SchemaField
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
PineconeCredentials = APIKeyCredentials
PineconeCredentialsInput = CredentialsMetaInput[
Literal["pinecone"],
Literal[ProviderName.PINECONE],
Literal["api_key"],
]
def PineconeCredentialsField() -> PineconeCredentialsInput:
"""
Creates a Pinecone credentials input on a block.
"""
"""Creates a Pinecone credentials input on a block."""
return CredentialsField(
provider="pinecone",
supported_credential_types={"api_key"},
description="The Pinecone integration can be used with an API Key.",
)
@ -98,10 +99,14 @@ class PineconeQueryBlock(Block):
include_metadata: bool = SchemaField(
description="Whether to include metadata in the response", default=True
)
host: str = SchemaField(description="Host for pinecone")
host: str = SchemaField(description="Host for pinecone", default="")
idx_name: str = SchemaField(description="Index name for pinecone")
class Output(BlockSchema):
results: dict = SchemaField(description="Query results from Pinecone")
results: Any = SchemaField(description="Query results from Pinecone")
combined_results: Any = SchemaField(
description="Combined results from Pinecone"
)
def __init__(self):
super().__init__(
@ -119,13 +124,105 @@ class PineconeQueryBlock(Block):
credentials: APIKeyCredentials,
**kwargs,
) -> BlockOutput:
pc = Pinecone(api_key=credentials.api_key.get_secret_value())
idx = pc.Index(host=input_data.host)
results = idx.query(
namespace=input_data.namespace,
vector=input_data.query_vector,
top_k=input_data.top_k,
include_values=input_data.include_values,
include_metadata=input_data.include_metadata,
try:
# Create a new client instance
pc = Pinecone(api_key=credentials.api_key.get_secret_value())
# Get the index
idx = pc.Index(input_data.idx_name)
# Ensure query_vector is in correct format
query_vector = input_data.query_vector
if isinstance(query_vector, list) and len(query_vector) > 0:
if isinstance(query_vector[0], list):
query_vector = query_vector[0]
results = idx.query(
namespace=input_data.namespace,
vector=query_vector,
top_k=input_data.top_k,
include_values=input_data.include_values,
include_metadata=input_data.include_metadata,
).to_dict() # type: ignore
combined_text = ""
if results["matches"]:
texts = [
match["metadata"]["text"]
for match in results["matches"]
if match.get("metadata", {}).get("text")
]
combined_text = "\n\n".join(texts)
# Return both the raw matches and combined text
yield "results", {
"matches": results["matches"],
"combined_text": combined_text,
}
yield "combined_results", combined_text
except Exception as e:
error_msg = f"Error querying Pinecone: {str(e)}"
raise RuntimeError(error_msg) from e
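Two behaviours worth noting in the rewritten query path: a nested query vector (a list wrapped in a single-element list) is unwrapped, and the `metadata["text"]` of each match is joined into one `combined_text` string. In isolation, with a made-up `results` dict mirroring what `to_dict()` returns:

```python
query_vector = [[0.1, 0.2, 0.3]]  # some upstream blocks emit a nested list
if isinstance(query_vector, list) and query_vector and isinstance(query_vector[0], list):
    query_vector = query_vector[0]  # unwrap to a flat vector

results = {  # shape mirroring idx.query(...).to_dict()
    "matches": [
        {"metadata": {"text": "first chunk"}},
        {"metadata": {}},  # matches without text are skipped
        {"metadata": {"text": "second chunk"}},
    ]
}
texts = [
    m["metadata"]["text"]
    for m in results["matches"]
    if m.get("metadata", {}).get("text")
]
combined_text = "\n\n".join(texts)  # -> "first chunk\n\nsecond chunk"
```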
class PineconeInsertBlock(Block):
class Input(BlockSchema):
credentials: PineconeCredentialsInput = PineconeCredentialsField()
index: str = SchemaField(description="Initialized Pinecone index")
chunks: list = SchemaField(description="List of text chunks to ingest")
embeddings: list = SchemaField(
description="List of embeddings corresponding to the chunks"
)
yield "results", results
namespace: str = SchemaField(
description="Namespace to use in Pinecone", default=""
)
metadata: dict = SchemaField(
description="Additional metadata to store with each vector", default={}
)
class Output(BlockSchema):
upsert_response: str = SchemaField(
description="Response from Pinecone upsert operation"
)
def __init__(self):
super().__init__(
id="477f2168-cd91-475a-8146-9499a5982434",
description="Upload data to a Pinecone index",
categories={BlockCategory.LOGIC},
input_schema=PineconeInsertBlock.Input,
output_schema=PineconeInsertBlock.Output,
)
def run(
self,
input_data: Input,
*,
credentials: APIKeyCredentials,
**kwargs,
) -> BlockOutput:
try:
# Create a new client instance
pc = Pinecone(api_key=credentials.api_key.get_secret_value())
# Get the index
idx = pc.Index(input_data.index)
vectors = []
for chunk, embedding in zip(input_data.chunks, input_data.embeddings):
vector_metadata = input_data.metadata.copy()
vector_metadata["text"] = chunk
vectors.append(
{
"id": str(uuid.uuid4()),
"values": embedding,
"metadata": vector_metadata,
}
)
idx.upsert(vectors=vectors, namespace=input_data.namespace)
yield "upsert_response", "successfully upserted"
except Exception as e:
error_msg = f"Error uploading to Pinecone: {str(e)}"
raise RuntimeError(error_msg) from e
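One design note on the new insert block: `zip(chunks, embeddings)` silently drops trailing items when the two lists differ in length, so a caller-side length check may be worth adding. A sketch of the vector payload the loop builds, using hypothetical chunks and embeddings:

```python
import uuid

chunks = ["first chunk", "second chunk"]
embeddings = [[0.1, 0.2], [0.3, 0.4]]  # hypothetical 2-dimensional embeddings
metadata = {"source": "example.pdf"}  # hypothetical shared metadata

if len(chunks) != len(embeddings):
    raise ValueError("chunks and embeddings must be the same length")

vectors = []
for chunk, embedding in zip(chunks, embeddings):
    vector_metadata = metadata.copy()
    vector_metadata["text"] = chunk  # the block stores the chunk text under "text"
    vectors.append(
        {"id": str(uuid.uuid4()), "values": embedding, "metadata": vector_metadata}
    )
# vectors is what idx.upsert(vectors=..., namespace=...) receives
```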


@ -115,7 +115,7 @@ class GetRedditPostsBlock(Block):
def get_posts(input_data: Input) -> Iterator[praw.reddit.Submission]:
client = get_praw(input_data.creds)
subreddit = client.subreddit(input_data.subreddit)
return subreddit.new(limit=input_data.post_limit)
return subreddit.new(limit=input_data.post_limit or 10)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
current_time = datetime.now(tz=timezone.utc)
@ -165,8 +165,10 @@ class PostRedditCommentBlock(Block):
def reply_post(creds: RedditCredentials, comment: RedditComment) -> str:
client = get_praw(creds)
submission = client.submission(id=comment.post_id)
comment = submission.reply(comment.comment)
return comment.id # type: ignore
new_comment = submission.reply(comment.comment)
if not new_comment:
raise ValueError("Failed to post comment.")
return new_comment.id
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "comment_id", self.reply_post(input_data.creds, input_data.data)


@ -3,11 +3,17 @@ from enum import Enum
from typing import Literal
import replicate
from autogpt_libs.supabase_integration_credentials_store.types import APIKeyCredentials
from pydantic import SecretStr
from replicate.helpers import FileOutput
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import CredentialsField, CredentialsMetaInput, SchemaField
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
@ -49,13 +55,11 @@ class ImageType(str, Enum):
class ReplicateFluxAdvancedModelBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[Literal["replicate"], Literal["api_key"]] = (
CredentialsField(
provider="replicate",
supported_credential_types={"api_key"},
description="The Replicate integration can be used with "
"any API key with sufficient permissions for the blocks it is used on.",
)
credentials: CredentialsMetaInput[
Literal[ProviderName.REPLICATE], Literal["api_key"]
] = CredentialsField(
description="The Replicate integration can be used with "
"any API key with sufficient permissions for the blocks it is used on.",
)
prompt: str = SchemaField(
description="Text prompt for image generation",
@ -197,7 +201,7 @@ class ReplicateFluxAdvancedModelBlock(Block):
client = replicate.Client(api_token=api_key.get_secret_value())
# Run the model with additional parameters
output = client.run(
output: FileOutput | list[FileOutput] = client.run(  # type: ignore  # replicate changed the return type without updating the type hint; it should be overloaded on `use_file_output` to `FileOutput | list[FileOutput]`, but is annotated as `Any | Iterator[Any]`
f"{model_name}",
input={
"prompt": prompt,
@ -210,13 +214,21 @@ class ReplicateFluxAdvancedModelBlock(Block):
"output_quality": output_quality,
"safety_tolerance": safety_tolerance,
},
wait=False,  # avoid the API arbitrarily returning a data:octet-stream payload or a URL depending on the model
)
# Check if output is a list or a string and extract accordingly; otherwise, assign a default message
if isinstance(output, list) and len(output) > 0:
result_url = output[0] # If output is a list, get the first element
if isinstance(output[0], FileOutput):
result_url = output[0].url # If output is a list, get the first element
else:
result_url = output[
0
] # If output is a list and not a FileOutput, get the first element. Should never happen, but just in case.
elif isinstance(output, FileOutput):
result_url = output.url # If output is a FileOutput, use the url
elif isinstance(output, str):
result_url = output # If output is a string, use it directly
result_url = output  # If output is a plain string (possible given replicate's loose type hints), use it directly
else:
result_url = (
"No output received" # Fallback message if output is not as expected


@ -1,20 +1,17 @@
from typing import Any, Literal
from typing import Literal
from urllib.parse import quote
import requests
from autogpt_libs.supabase_integration_credentials_store.types import APIKeyCredentials
from pydantic import SecretStr
from backend.blocks.helpers.http import GetRequest
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import CredentialsField, CredentialsMetaInput, SchemaField
class GetRequest:
@classmethod
def get_request(cls, url: str, json=False) -> Any:
response = requests.get(url)
response.raise_for_status()
return response.json() if json else response.text
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
class GetWikipediaSummaryBlock(Block, GetRequest):
@ -48,80 +45,6 @@ class GetWikipediaSummaryBlock(Block, GetRequest):
yield "summary", response["extract"]
class SearchTheWebBlock(Block, GetRequest):
class Input(BlockSchema):
query: str = SchemaField(description="The search query to search the web for")
class Output(BlockSchema):
results: str = SchemaField(
description="The search results including content from top 5 URLs"
)
error: str = SchemaField(description="Error message if the search fails")
def __init__(self):
super().__init__(
id="87840993-2053-44b7-8da4-187ad4ee518c",
description="This block searches the internet for the given search query.",
categories={BlockCategory.SEARCH},
input_schema=SearchTheWebBlock.Input,
output_schema=SearchTheWebBlock.Output,
test_input={"query": "Artificial Intelligence"},
test_output=("results", "search content"),
test_mock={"get_request": lambda url, json: "search content"},
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
# Encode the search query
encoded_query = quote(input_data.query)
# Prepend the Jina Search URL to the encoded query
jina_search_url = f"https://s.jina.ai/{encoded_query}"
# Make the request to Jina Search
response = self.get_request(jina_search_url, json=False)
# Output the search results
yield "results", response
class ExtractWebsiteContentBlock(Block, GetRequest):
class Input(BlockSchema):
url: str = SchemaField(description="The URL to scrape the content from")
raw_content: bool = SchemaField(
default=False,
title="Raw Content",
description="Whether to do a raw scrape of the content or use Jina-ai Reader to scrape the content",
advanced=True,
)
class Output(BlockSchema):
content: str = SchemaField(description="The scraped content from the given URL")
error: str = SchemaField(
description="Error message if the content cannot be retrieved"
)
def __init__(self):
super().__init__(
id="436c3984-57fd-4b85-8e9a-459b356883bd",
description="This block scrapes the content from the given web URL.",
categories={BlockCategory.SEARCH},
input_schema=ExtractWebsiteContentBlock.Input,
output_schema=ExtractWebsiteContentBlock.Output,
test_input={"url": "https://en.wikipedia.org/wiki/Artificial_intelligence"},
test_output=("content", "scraped content"),
test_mock={"get_request": lambda url, json: "scraped content"},
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
if input_data.raw_content:
url = input_data.url
else:
url = f"https://r.jina.ai/{input_data.url}"
content = self.get_request(url, json=False)
yield "content", content
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="openweathermap",
@ -143,10 +66,8 @@ class GetWeatherInformationBlock(Block, GetRequest):
description="Location to get weather information for"
)
credentials: CredentialsMetaInput[
Literal["openweathermap"], Literal["api_key"]
Literal[ProviderName.OPENWEATHERMAP], Literal["api_key"]
] = CredentialsField(
provider="openweathermap",
supported_credential_types={"api_key"},
description="The OpenWeatherMap integration can be used with "
"any API key with sufficient permissions for the blocks it is used on.",
)
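This hunk is one instance of a pattern repeated throughout the diff: the provider is now identified by a `ProviderName` enum member instead of a bare string, and `CredentialsField` no longer takes `provider=` or `supported_credential_types=` arguments. A minimal sketch of the new-style declaration (the description text here is illustrative):

```python
from typing import Literal

from backend.data.block import BlockSchema
from backend.data.model import CredentialsField, CredentialsMetaInput
from backend.integrations.providers import ProviderName


class Input(BlockSchema):
    # Old style (removed in this diff):
    #   CredentialsMetaInput[Literal["openweathermap"], Literal["api_key"]]
    #   CredentialsField(provider="openweathermap", supported_credential_types={"api_key"}, ...)
    credentials: CredentialsMetaInput[
        Literal[ProviderName.OPENWEATHERMAP], Literal["api_key"]
    ] = CredentialsField(
        description="Any OpenWeatherMap API key with sufficient permissions.",
    )
```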


@ -0,0 +1,70 @@
from enum import Enum
from typing import Literal
from pydantic import BaseModel, SecretStr
from backend.data.model import APIKeyCredentials, CredentialsField, CredentialsMetaInput
from backend.integrations.providers import ProviderName
Slant3DCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.SLANT3D], Literal["api_key"]
]
def Slant3DCredentialsField() -> Slant3DCredentialsInput:
return CredentialsField(description="Slant3D API key for authentication")
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="slant3d",
api_key=SecretStr("mock-slant3d-api-key"),
title="Mock Slant3D API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
class CustomerDetails(BaseModel):
name: str
email: str
phone: str
address: str
city: str
state: str
zip: str
country_iso: str = "US"
is_residential: bool = True
class Color(Enum):
WHITE = "white"
BLACK = "black"
class Profile(Enum):
PLA = "PLA"
PETG = "PETG"
class OrderItem(BaseModel):
# filename: str
file_url: str
quantity: str # String as per API spec
color: Color = Color.WHITE
profile: Profile = Profile.PLA
# image_url: str = ""
# sku: str = ""
class Filament(BaseModel):
filament: str
hexColor: str
colorTag: str
profile: str
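For orientation, constructing the request models above might look like the following; all values are invented, and the import mirrors the `from ._api import ...` used by the sibling modules below:

```python
from ._api import Color, CustomerDetails, OrderItem, Profile  # as the sibling modules do

customer = CustomerDetails(
    name="Ada Lovelace",
    email="ada@example.com",
    phone="555-0100",
    address="1 Analytical Way",
    city="Portland",
    state="OR",
    zip="97201",
    country_iso="US",
    is_residential=True,
)
item = OrderItem(
    file_url="https://example.com/models/bracket.stl",
    quantity="2",  # string, per the API-spec comment above
    color=Color.BLACK,
    profile=Profile.PETG,
)
```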


@ -0,0 +1,94 @@
from typing import Any, Dict
from backend.data.block import Block
from backend.util.request import requests
from ._api import Color, CustomerDetails, OrderItem, Profile
class Slant3DBlockBase(Block):
"""Base block class for Slant3D API interactions"""
BASE_URL = "https://www.slant3dapi.com/api"
def _get_headers(self, api_key: str) -> Dict[str, str]:
return {"api-key": api_key, "Content-Type": "application/json"}
def _make_request(self, method: str, endpoint: str, api_key: str, **kwargs) -> Dict:
url = f"{self.BASE_URL}/{endpoint}"
response = requests.request(
method=method, url=url, headers=self._get_headers(api_key), **kwargs
)
if not response.ok:
error_msg = response.json().get("error", "Unknown error")
raise RuntimeError(f"API request failed: {error_msg}")
return response.json()
def _check_valid_color(self, profile: Profile, color: Color, api_key: str) -> str:
response = self._make_request(
"GET",
"filament",
api_key,
params={"profile": profile.value, "color": color.value},
)
if profile == Profile.PLA:
color_tag = color.value
else:
color_tag = f"{profile.value.lower()}{color.value.capitalize()}"
valid_tags = [filament["colorTag"] for filament in response["filaments"]]
if color_tag not in valid_tags:
raise ValueError(
f"""Invalid color profile combination {color_tag}.
Valid colors for {profile.value} are:
{','.join([filament['colorTag'].replace(profile.value.lower(), '') for filament in response['filaments'] if filament['profile'] == profile.value])}
"""
)
return color_tag
def _convert_to_color(self, profile: Profile, color: Color, api_key: str) -> str:
return self._check_valid_color(profile, color, api_key)
def _format_order_data(
self,
customer: CustomerDetails,
order_number: str,
items: list[OrderItem],
api_key: str,
) -> list[dict[str, Any]]:
"""Helper function to format order data for API requests"""
orders = []
for item in items:
order_data = {
"email": customer.email,
"phone": customer.phone,
"name": customer.name,
"orderNumber": order_number,
"filename": item.file_url,
"fileURL": item.file_url,
"bill_to_street_1": customer.address,
"bill_to_city": customer.city,
"bill_to_state": customer.state,
"bill_to_zip": customer.zip,
"bill_to_country_as_iso": customer.country_iso,
"bill_to_is_US_residential": str(customer.is_residential).lower(),
"ship_to_name": customer.name,
"ship_to_street_1": customer.address,
"ship_to_city": customer.city,
"ship_to_state": customer.state,
"ship_to_zip": customer.zip,
"ship_to_country_as_iso": customer.country_iso,
"ship_to_is_US_residential": str(customer.is_residential).lower(),
"order_item_name": item.file_url,
"order_quantity": item.quantity,
"order_image_url": "",
"order_sku": "NOT_USED",
"order_item_color": self._convert_to_color(
item.profile, item.color, api_key
),
"profile": item.profile.value,
}
orders.append(order_data)
return orders
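A small worked example of the colorTag convention that `_check_valid_color` validates against, using plain strings in place of the `Profile`/`Color` enum values:

```python
def expected_color_tag(profile: str, color: str) -> str:
    # PLA keeps the bare color tag; other profiles prefix it with the lowercased profile name
    return color if profile == "PLA" else f"{profile.lower()}{color.capitalize()}"


assert expected_color_tag("PLA", "black") == "black"
assert expected_color_tag("PETG", "white") == "petgWhite"
```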


@ -0,0 +1,85 @@
from typing import List
from backend.data.block import BlockOutput, BlockSchema
from backend.data.model import APIKeyCredentials, SchemaField
from ._api import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
Filament,
Slant3DCredentialsField,
Slant3DCredentialsInput,
)
from .base import Slant3DBlockBase
class Slant3DFilamentBlock(Slant3DBlockBase):
"""Block for retrieving available filaments"""
class Input(BlockSchema):
credentials: Slant3DCredentialsInput = Slant3DCredentialsField()
class Output(BlockSchema):
filaments: List[Filament] = SchemaField(
description="List of available filaments"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="7cc416f4-f305-4606-9b3b-452b8a81031c",
description="Get list of available filaments",
input_schema=self.Input,
output_schema=self.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"filaments",
[
{
"filament": "PLA BLACK",
"hexColor": "000000",
"colorTag": "black",
"profile": "PLA",
},
{
"filament": "PLA WHITE",
"hexColor": "ffffff",
"colorTag": "white",
"profile": "PLA",
},
],
)
],
test_mock={
"_make_request": lambda *args, **kwargs: {
"filaments": [
{
"filament": "PLA BLACK",
"hexColor": "000000",
"colorTag": "black",
"profile": "PLA",
},
{
"filament": "PLA WHITE",
"hexColor": "ffffff",
"colorTag": "white",
"profile": "PLA",
},
]
}
},
)
def run(
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
try:
result = self._make_request(
"GET", "filament", credentials.api_key.get_secret_value()
)
yield "filaments", result["filaments"]
except Exception as e:
yield "error", str(e)
raise

Some files were not shown because too many files have changed in this diff.