hashbrown

This crate is a Rust port of Google's high-performance SwissTable hash map, adapted to make it a drop-in replacement for Rust's standard HashMap and HashSet types.

The original C++ version of SwissTable can be found here, and this CppCon talk gives an overview of how the algorithm works.

Since Rust 1.36, this is the HashMap implementation used by the Rust standard library. However, you may still want to use this crate instead since it works in environments without std, such as embedded systems and kernels.

Change log
Features
Drop-in replacement for the standard library HashMap and HashSet types.
Uses AHash as the default hasher, which is much faster than SipHash. However, AHash does not provide the same level of HashDoS resistance as SipHash, so if that is important to you, you might want to consider using a different hasher.
Around 2x faster than the previous standard library HashMap.
Lower memory usage: only 1 byte of overhead per entry instead of 8.
Compatible with #[no_std] (but requires a global allocator with the alloc crate).
Empty hash maps do not allocate any memory.
SIMD lookups to scan multiple hash entries in parallel.
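To illustrate the drop-in claim, here is a minimal sketch (not from the crate docs; the word-counting logic is only illustrative) in which the import is the only difference from std::collections::HashMap:

use hashbrown::HashMap;

fn main() {
    // Same API as std::collections::HashMap, backed by the SwissTable design.
    let mut counts: HashMap<&str, u32> = HashMap::new();
    for word in ["apple", "banana", "apple"] {
        *counts.entry(word).or_insert(0) += 1;
    }
    assert_eq!(counts["apple"], 2);
    println!("{:?}", counts);
}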
https://crates.io/crates/hashbrown
cfg-if
Documentation

A macro to ergonomically define an item depending on a large number of #[cfg] parameters. Structured like an if-else chain, the first matching branch is the item that gets emitted.

[dependencies]
cfg-if = "0.1"
Example
cfg_if::cfg_if! {
    if #[cfg(unix)] {
        fn foo() { /* unix specific functionality */ }
    } else if #[cfg(target_pointer_width = "32")] {
        fn foo() { /* non-unix, 32-bit functionality */ }
    } else {
        fn foo() { /* fallback implementation */ }
    }
}

fn main() {
    foo();
}
License
This project is licensed under either of

Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
https://crates.io/crates/cfg-if
libc - Raw FFI bindings to platforms' system libraries

libc provides all of the definitions necessary to easily interoperate with C code (or "C-like" code) on each of the platforms that Rust supports. This includes type definitions (e.g. c_int), constants (e.g. EINVAL) as well as function headers (e.g. malloc).

This crate exports all underlying platform types, functions, and constants under the crate root, so all items are accessible as libc::foo. The types and values of all the exported APIs match the platform that libc is compiled for.
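As a brief sketch (not taken from the libc documentation; it assumes a target where EINVAL and strlen are defined, which covers the common platforms), touching a type, a constant, and a function from the crate root:

use std::ffi::CString;

fn main() {
    // A type and a constant exported at the crate root.
    let err: libc::c_int = libc::EINVAL;

    // A raw C function binding: call the platform's strlen.
    let s = CString::new("hello").unwrap();
    let len = unsafe { libc::strlen(s.as_ptr()) };

    println!("EINVAL = {}, strlen = {}", err, len);
}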

More detailed information about the design of this library can be found in its associated RFC.
https://crates.io/crates/libc
bitflags

bitflags generates flags enums with well-defined semantics and ergonomic end-user APIs.

You can use bitflags to:

provide more user-friendly bindings to C APIs where flags may or may not be fully known in advance.
generate efficient options types with string parsing and formatting support.
You can't use bitflags to:

guarantee only bits corresponding to defined flags will ever be set. bitflags allows access to the underlying bits type so arbitrary bits may be set.

define bitfields. bitflags only generates types where set bits denote the presence of some combination of flags.
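As a short sketch of the kind of flags type the macro generates (assuming bitflags 2.x; the Permissions type and its bit values are illustrative, not from the crate docs):

use bitflags::bitflags;

bitflags! {
    // Each set bit denotes the presence of one flag.
    #[derive(Debug, Clone, Copy, PartialEq, Eq)]
    struct Permissions: u32 {
        const READ  = 0b0001;
        const WRITE = 0b0010;
        const EXEC  = 0b0100;
    }
}

fn main() {
    let p = Permissions::READ | Permissions::WRITE;
    assert!(p.contains(Permissions::READ));
    assert!(!p.contains(Permissions::EXEC));
    println!("{:?}", p);
}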

Documentation

Specification

Release notes
https://crates.io/crates/bitflags
Rust Quasi-Quoting

This crate provides the quote! macro for turning Rust syntax tree data structures into tokens of source code.

Procedural macros in Rust receive a stream of tokens as input, execute arbitrary Rust code to determine how to manipulate those tokens, and produce a stream of tokens to hand back to the compiler to compile into the caller's crate. Quasi-quoting is a solution to one piece of that — producing tokens to return to the compiler.

The idea of quasi-quoting is that we write code that we treat as data. Within the quote! macro, we can write what looks like code to our text editor or IDE. We get all the benefits of the editor's brace matching, syntax highlighting, indentation, and maybe autocompletion. But rather than compiling that as code into the current crate, we can treat it as data, pass it around, mutate it, and eventually hand it back to the compiler as tokens to compile into the macro caller's crate.
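As a minimal sketch of that idea, run outside of any macro (the generated function name is illustrative):

use quote::{format_ident, quote};

fn main() {
    let name = format_ident!("answer");

    // What looks like code here is treated purely as data: a proc_macro2::TokenStream.
    let tokens = quote! {
        fn #name() -> u32 {
            40 + 2
        }
    };

    println!("{}", tokens);
}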

This crate is motivated by the procedural macro use case, but is a general-purpose Rust quasi-quoting library and is not specific to procedural macros.
https://crates.io/crates/quote
proc-macro2

A wrapper around the procedural macro API of the compiler's proc_macro crate. This library serves two purposes:

Bring proc-macro-like functionality to other contexts like build.rs and main.rs. Types from proc_macro are entirely specific to procedural macros and cannot ever exist in code outside of a procedural macro. Meanwhile proc_macro2 types may exist anywhere including non-macro code. By developing foundational libraries like syn and quote against proc_macro2 rather than proc_macro, the procedural macro ecosystem becomes easily applicable to many other use cases and we avoid reimplementing non-macro equivalents of those libraries.

Make procedural macros unit testable. As a consequence of being specific to procedural macros, nothing that uses proc_macro can be executed from a unit test. In order for helper libraries or components of a macro to be testable in isolation, they must be implemented using proc_macro2.
https://crates.io/crates/proc-macro2
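A small sketch of the second point, exercising proc_macro2 from an ordinary unit test rather than from a macro (the test name is illustrative):

use proc_macro2::TokenStream;
use std::str::FromStr;

#[test]
fn parses_outside_a_macro() {
    // proc_macro2 types may exist anywhere, including non-macro code.
    let tokens = TokenStream::from_str("struct Point { x: i32, y: i32 }").unwrap();
    assert!(!tokens.is_empty());
}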
Parser for Rust source code

Syn is a parsing library for parsing a stream of Rust tokens into a syntax tree of Rust source code.

Currently this library is geared toward use in Rust procedural macros, but contains some APIs that may be useful more generally.

Data structures — Syn provides a complete syntax tree that can represent any valid Rust source code. The syntax tree is rooted at syn::File which represents a full source file, but there are other entry points that may be useful to procedural macros including syn::Item, syn::Expr and syn::Type.

Derives — Of particular interest to derive macros is syn::DeriveInput which is any of the three legal input items to a derive macro. An example below shows using this type in a library that can derive implementations of a user-defined trait.

Parsing — Parsing in Syn is built around parser functions with the signature fn(ParseStream) -> Result<T>. Every syntax tree node defined by Syn is individually parsable and may be used as a building block for custom syntaxes, or you may dream up your own brand new syntax without involving any of our syntax tree types.

Location information — Every token parsed by Syn is associated with a Span that tracks line and column information back to the source of that token. These spans allow a procedural macro to display detailed error messages pointing to all the right places in the user's code. There is an example of this below.

Feature flags — Functionality is aggressively feature gated so your procedural macros enable only what they need, and do not pay in compile time for all the rest.
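As a hedged sketch of the Data structures and Derives points above (not the library's own example; it parses with syn::parse_str instead of a real macro entry point, and the type names are illustrative):

use quote::quote;
use syn::{parse_str, DeriveInput};

fn main() {
    // Parse source text into syn's syntax tree.
    let input: DeriveInput = parse_str("struct Point { x: i32, y: i32 }").unwrap();
    let name = &input.ident;

    // Generate code with quote, the way a derive macro would hand tokens back to the compiler.
    let expanded = quote! {
        impl #name {
            pub fn type_name() -> &'static str {
                stringify!(#name)
            }
        }
    };
    println!("{}", expanded);
}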

Version requirement: Syn supports rustc 1.61 and up.

Release notes
https://crates.io/crates/syn
multitag
multitag is a Rust crate for reading and writing music metadata in a variety of formats. It aims to fix some of the issues present in audiotag, for example by adding WAV file support.

It currently supports reading and writing metadata to mp3, wav, aiff, flac, and mp4/m4a/... files, with support for more formats on the way.
https://crates.io/crates/multitag
Embedded Heatshrink
This library is a rewrite/port of the C library heatshrink. It has the same sink/poll API as the original library, but it is written in Rust. It is faster because of some optimizations for pushing bits and array manipulation. It fixes some bugs found during fuzzing.

Key Features
Low memory usage (as low as 50 bytes): it is useful for some cases with less than 50 bytes, and useful for many general cases with < 300 bytes.
Incremental, bounded CPU use: you can chew on input data in arbitrarily tiny bites. This is a useful property in hard real-time environments.
Usage
The library includes an example that uses the streaming API to one-shot compress. If you want to stream continuously, reuse the same HeatshrinkEncoder instance; the HeatshrinkDecoder works the same way.

https://crates.io/crates/embedded-heatshrink
pfxers
pfxers allows you to look into PFX or PEM files, display their properties and copy their contents (certificate, certificate chains, key).

Cargo

cargo install pfxers --locked
Usage Examples
Basic usage:

pfxers certificate.crt
Using a password protected PFX file:

pfxers password-protected.pfx --password 'thisissecret'
Using a password protected PFX file, the password being in a file:

pfxers password-protected.pfx --password-file password.txt
Command Reference
Usage: pfxers [OPTIONS] <INPUT>

Arguments:
<INPUT> The PFX/PKCS12/pem file to inspect

Options:
--password-file <PASSWORD_FILE>
The file containing the password of the PFX/PKCS12 file
--password <PASSWORD>
The password of the PFX/PKCS12 file. You should prefer the use of --password-file or
the PFX_PASSWORD environment variable [env: PFX_PASSWORD=]
-h, --help
Print help
-V, --version
Print version
License
This project is licensed under either of

Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.

Copyright 2024 pfxers Contributors

Contribution
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
https://crates.io/crates/pfxers
eSQL

Enhanced SQL experience for your Rust project. The crate contains some lightweight utilities that can be used with tokio-postgres and mysql_async.

Dynamic SQL builder
TODO: documentation

Migrations
TODO: documentation
https://crates.io/crates/esql
actix-cloud
Actix Cloud is an all-in-one web framework based on Actix Web.

Features
Actix Cloud is highly configurable. You can enable only the features you need, implement your own feature backend, or even use other libraries.

logger (Default: Enable)
i18n (Default: Disable)
security (Embedded)
memorydb backend
    default (Embedded)
    redis (Default: Disable)
auth (Embedded)
session (Default: Disable)
Guide
Quick Start
You can refer to the Hello world example for basic usage.

Application
Since application configuration can be quite dynamic, you need to build it on your own. Here are some useful middlewares:

App::new()
    .wrap(middleware::Compress::default()) // compress page
    .wrap(SecurityHeader::default().build()) // default security header
    .wrap(SessionMiddleware::builder(memorydb.clone(), Key::generate()).build()) // session
    ...
    .app_data(state_cloned.clone())
logger
We use tracing as our logger library. It is thread safe. You can use it everywhere.

Start logger:

LoggerBuilder::new().level(Level::DEBUG).start() // colorful output
LoggerBuilder::new().json().start() // json output
You can also customize the logger with filter, transformer, etc.

Reinit logger (e.g., in plugins), or manually send logs:

logger.init(LoggerBuilder::new());
logger.sender().send(...);
i18n
We use rust-i18n-support from rust-i18n as our i18n core.

Load locale:

let mut locale = Locale::new(String::from("en-US"));
locale.add_locale(i18n!("locale"));
Translate:

t!(locale, "hello.world")
t!(locale, "hello.name", name = "MEME")
See examples for more usage.

security
Middleware to add security headers:

app.wrap(SecurityHeader::default().build())
Default header:

X-Content-Type-Options: nosniff
Referrer-Policy: strict-origin-when-cross-origin
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block
Cross-Origin-Opener-Policy: same-origin
Content-Security-Policy: default-src 'none'; script-src 'none'; object-src 'none'; base-uri 'none'; form-action 'none'; frame-ancestors 'none'
Enable HSTS when using HTTPS:

security_header.set_default_hsts();
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
memorydb-default
Actix Cloud has a default memory database backend used for sessions. You can also use your own backend if you implement actix_cloud::memorydb::MemoryDB.

Note: the default backend does not enforce a memory limit, so DDoS is possible if gateway rate limiting is not implemented.

DefaultBackend::new().await.unwrap()
memorydb-redis
Redis can be used as another backend for memory database.

RedisBackend::new("redis://user:[email protected]:6379/0").await.unwrap()
auth
Authentication is quite simple: you only need to implement an extractor and a checker.

The extractor is used to extract your own authentication type from the request. For example, assume we use 0 for guest and 1 for admin; our authentication type is just Vec<u32>:

fn perm_extractor(req: &mut ServiceRequest) -> Vec<u32> {
    let mut ret = Vec::new();
    ret.push(0); // guest permission is assigned by default.

    // test if query string has admin=1.
    let qs = QString::from(req.query_string());
    if qs.get("admin").is_some_and(|x| x == "1") {
        ret.push(1);
    }
    ret
}
The checker is used to check the permission; the server will return 403 if the return value is false:

fn is_guest(p: Vec<u32>) -> bool {
    p.into_iter().find(|x| *x == 0).is_some()
}
Then build the Router and configure it in the App using build_router:

app.service(scope("/api").configure(build_router(...)))
session
Most features and usages are based on actix-session, except for these:

MemoryDB is the only supported storage.
Error uses actix_cloud::error::Error.
You can set _ttl in the session to override the TTL of the session.
app.wrap(SessionMiddleware::builder(memorydb.clone(), Key::generate()).build())
License
This project is licensed under the MIT license.
https://crates.io/crates/actix-cloud
box2d_sys
This crate contains Rust bindings to Box2D v3.0.

This crate implements certain traits for Box2D types, such as Hash, PartialEq, etc. We add these whenever they are needed for our projects which use this crate.

License
MIT or Apache-2.0 at your option, but note that Box2D itself is licensed under MIT.
https://crates.io/crates/box2d_sys
print_raster
v0.1.0
A crate for processing print raster images in Rust
#print #raster #cups #pwg #urf


Supported Formats
URF (Apple Raster)
CUPS Raster V1
CUPS Raster V2, including PWG Raster (a subset of CUPS Raster V2)
CUPS Raster V3
Features
Fully Asynchronous I/O
Relatively low-level API
Development
You can run unit tests, integration tests, and documentation tests with the following command:

cargo test
For fuzz testing, it's a bit more complicated. You need to use the honggfuzz tool, which only works on a few platforms. See here to set it up.

After setting up honggfuzz, you can run a fuzz target:

cargo hfuzz run <fuzz_target>
https://crates.io/crates/print_raster
The 🤗 Machine Learning for 3D Course

Sign up
To receive updates as the course releases, sign up for the course mailing list here.

Overview
In this course, you’ll learn:

What’s going on - the current big picture of machine learning for 3D
Why it matters - the importance of recent developments
How to do it yourself - build your own generative 3D demo
https://huggingface.co/learn/ml-for-3d-course/unit0/introduction
Hugging Face Diffusion Models Course
In this free course, you will:

👩‍🎓 Study the theory behind diffusion models
🧨 Learn how to generate images and audio with the popular 🤗 Diffusers library
🏋️‍♂️ Train your own diffusion models from scratch
📻 Fine-tune existing diffusion models on new datasets
🗺 Explore conditional generation and guidance
🧑‍🔬 Create your own custom diffusion model pipelines
Prerequisites
This course requires a good level of Python and a grounding in deep learning and PyTorch. If that’s not yet the case, you can check out these free resources:

Python: https://www.udacity.com/course/introduction-to-python--ud1110
Intro to Deep Learning with PyTorch: https://www.udacity.com/course/deep-learning-pytorch--ud188
PyTorch in 60min: https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html
To upload your models to the Hugging Face Hub, you’ll need an account. You can create one for free at the following address: https://huggingface.co/join.
https://huggingface.co/learn/diffusion-course/unit0/1
The 🤗 Machine Learning for Games Course
Welcome to the course that will teach you the most fascinating topic in game development: how to use powerful AI tools and models to create unique game experiences.

New AI models are revolutionizing the Game Industry in two impactful ways:

On how we make games:

Generating textures using AI.
Using AI voice actors for the voices.
On how we create gameplay:

Crafting smart Non-Playable Characters (NPCs) using large language models.
This course will teach you:

How to integrate AI models for innovative gameplay, featuring intelligent NPCs.
How to use AI tools to help your game development pipeline.

https://huggingface.co/learn/ml-games-course/unit0/introduction
Open-Source AI Cookbook
The Open-Source AI Cookbook is a collection of notebooks illustrating practical aspects of building AI applications and solving various machine learning tasks using open-source tools and models.

Latest notebooks
Check out the recently added notebooks:

Building RAG with Custom Unstructured Data
Agentic RAG: turbocharge your RAG with query reformulation and self-query! 🚀
Create a Transformers Agent from any LLM inference provider
Fine-tuning LLM to Generate Persian Product Catalogs in JSON Format
Agent for text-to-SQL with automatic error correction
Information Extraction with Haystack and NuExtract
RAG with Hugging Face and Milvus
Data analyst agent: get your data’s insights in the blink of an eye
Code Search with Vector Embeddings and Qdrant
RAG backed by SQL and Jina Reranker
You can also check out the notebooks in the cookbook’s GitHub repo.
https://huggingface.co/learn/cookbook/index
Welcome to the Hugging Face Audio course!
Dear learner,

Welcome to this course on using transformers for audio. Time and again transformers have proven themselves as one of the most powerful and versatile deep learning architectures, capable of achieving state-of-the-art results in a wide range of tasks, including natural language processing, computer vision, and more recently, audio processing.

In this course, we will explore how transformers can be applied to audio data. You’ll learn how to use them to tackle a range of audio-related tasks. Whether you are interested in speech recognition, audio classification, or generating speech from text, transformers and this course have got you covered.

To give you a taste of what these models can do, say a few words in the demo on the course page and watch the model transcribe them in real time!

https://huggingface.co/learn/audio-course/chapter0/introduction