A CLI tool and an API for fetching data from Twitter for free! It is recommended to install the package globally if you want to use it from the CLI. [...] User Timeline 'user' authentication (logging in) grants access to the following resources/actions: Tweet Details [...] The API_KEY generated by logging in is what allows Rettiwt-API to authenticate as a logged-in user while interacting with the Twitter API ('user' authentication). [...] This API uses the cookies of a Twitter account to fetch data from Twitter, and as such there is always a chance (although a measly one) of the account getting banned by Twitter's algorithm.
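As a rough illustration, here is a minimal sketch of how that API_KEY might be used programmatically. The Rettiwt constructor option apiKey and the tweet.details resource follow the package's documented pattern, but the exact names may differ between versions, and the key and tweet ID values are placeholders; check the current rettiwt-api documentation before relying on this.

```ts
import { Rettiwt } from 'rettiwt-api';

// Placeholder for the API_KEY generated by logging in.
const API_KEY = '<api_key_generated_by_logging_in>';

// Create a client that authenticates as a logged-in user ('user' authentication).
const rettiwt = new Rettiwt({ apiKey: API_KEY });

// Fetch the details of a single tweet; '<tweet_id>' is a placeholder.
rettiwt.tweet.details('<tweet_id>')
  .then((tweet) => console.log(tweet))
  .catch((error) => console.error(error));
```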
https://www.warp.dev/ Smart & Reliable: With knowledge sharing tools, autocompletions, and fully integrated AI, Warp is a more intelligent terminal — out of the box. Warp is built with Rust, rendered with Metal, and optimized for performance. [...] “Warp includes so many things I can’t live without — from clicking where you want the cursor to go, to AI autocorrect.” [...] I love how I can navigate the terminal editor just like a code editor with all the Move/Move-Select keymaps.
🚧 We're currently working on a large-scale refactor, which can be found on the https://github.com/documenso/documenso/tree/feat/refresh branch. Signing documents digitally is fast and easy, and it should be best practice for every document signed worldwide. [...] Contact us if you are interested in our Enterprise plan for large organizations that need extra flexibility and control. [...] We are still working on publishing Docker images; in the meantime, you can follow the steps below to create a production-ready Docker image. [...]

For local Docker:

    docker run -it documenso:latest npm run start -- -H ::

For k8s or docker-compose:

    containers:
      - name: documenso
        image: documenso:latest
        imagePullPolicy: IfNotPresent
        command:
          - npm
        args:
          - run
          - start
          - --
          - "-H"
          - "::"
Usage: open the script page at https://greasyfork.org/zh-CN/scripts/470359-twitter-block-porn and install the script.
Unified Model Serving Framework
🏹 Scalable with powerful performance optimizations
The Runner abstraction scales model inference separately from your custom code, and multi-core CPU utilization is maximized with automatic provisioning [...]
We strip out as much potentially sensitive information as possible, and we will never collect user code, model data, model names, or stack traces.