Commit 47c4c29

Changed name from RedisAI to redis-inference-optimization, and updated README
1 parent 2fc312d commit 47c4c29

File tree

1 file changed (+37, -40 lines)


README.md

Lines changed: 37 additions & 40 deletions
@@ -1,43 +1,44 @@
-[![GitHub issues](https://img.shields.io/github/release/RedisAI/RedisAI.svg?sort=semver)](https://github.com/RedisAI/RedisAI/releases/latest)
-[![CircleCI](https://circleci.com/gh/RedisAI/RedisAI/tree/master.svg?style=svg)](https://circleci.com/gh/RedisAI/RedisAI/tree/master)
-[![Dockerhub](https://img.shields.io/badge/dockerhub-redislabs%2Fredisai-blue)](https://hub.docker.com/r/redislabs/redisai/tags/)
-[![codecov](https://codecov.io/gh/RedisAI/RedisAI/branch/master/graph/badge.svg)](https://codecov.io/gh/RedisAI/RedisAI)
-[![Total alerts](https://img.shields.io/lgtm/alerts/g/RedisAI/RedisAI.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/RedisAI/RedisAI/alerts/)
-[![Forum](https://img.shields.io/badge/Forum-RedisAI-blue)](https://forum.redislabs.com/c/modules/redisai)
+[![GitHub issues](https://img.shields.io/github/release/redis-inference-optimization/redis-inference-optimization.svg?sort=semver)](https://github.com/redis-inference-optimization/redis-inference-optimization/releases/latest)
+[![CircleCI](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master.svg?style=svg)](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master)
+[![Dockerhub](https://img.shields.io/badge/dockerhub-redislabs%2Fredis--inference--optimization-blue)](https://hub.docker.com/r/redislabs/redis-inference-optimization/tags/)
+[![codecov](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization/branch/master/graph/badge.svg)](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization)
+[![Total alerts](https://img.shields.io/lgtm/alerts/g/redis-inference-optimization/redis-inference-optimization.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/redis-inference-optimization/redis-inference-optimization/alerts/)
+[![Forum](https://img.shields.io/badge/Forum-redis--inference--optimization-blue)](https://forum.redislabs.com/c/modules/redis-inference-optimization)
 [![Discord](https://img.shields.io/discord/697882427875393627?style=flat-square)](https://discord.gg/rTQm7UZ)
 
 > [!CAUTION]
-> **RedisAI is no longer actively maintained or supported.**
+> **redis-inference-optimization is no longer actively maintained or supported.**
 >
-> We are grateful to the RedisAI community for their interest and support.
+> We are grateful to the redis-inference-optimization community for their interest and support.
+> Previously, redis-inference-optimization was named RedisAI, but it was renamed in Jan 2025 to reduce confusion with Redis' other AI offerings.
 
-# RedisAI
-RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, as well as simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure.
+# redis-inference-optimization
+Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is to serve as a "workhorse" for model serving, providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **Redis-inference-optimization both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.
 
-To read RedisAI docs, visit [redisai.io](https://oss.redis.com/redisai/). To see RedisAI in action, visit the [demos page](https://oss.redis.com/redisai/examples/).
+To see redis-inference-optimization in action, visit the [demos page](https://oss.redis.com/redis-inference-optimization/examples/).
 
 # Quickstart
-RedisAI is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
+redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.
 
-The following sections describe how to get started with RedisAI.
+The following sections describe how to get started with redis-inference-optimization.
 
 ## Docker
-The quickest way to try RedisAI is by launching its official Docker container images.
+The quickest way to try redis-inference-optimization is by launching its official Docker container images.
 ### On a CPU only machine
 ```
-docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic
+docker run -p 6379:6379 redislabs/redis-inference-optimization:1.2.7-cpu-bionic
 ```
 
 ### On a GPU machine
 For GPU support, you will need a machine that has the Nvidia driver (CUDA 11.3 and cuDNN 8.1), nvidia-container-toolkit, and Docker 19.03+ installed. For detailed information, check out the [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker).
 
 ```
-docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic
+docker run -p 6379:6379 --gpus all -it --rm redislabs/redis-inference-optimization:1.2.7-gpu-bionic
 ```
 
 
 ## Building
-You can compile and build the module from its source code. The [Developer](https://oss.redis.com/redisai/developer/) page has more information about the design and implementation of the RedisAI module and how to contribute.
+You can compile and build the module from its source code.
 
 ### Prerequisites
 * Packages: git, python3, make, wget, g++/clang, & unzip
@@ -49,17 +50,17 @@ You can compile and build the module from its source code. The [Developer](https
 You can obtain the module's source code by cloning the project's repository using git like so:
 
 ```sh
-git clone --recursive https://github.com/RedisAI/RedisAI
+git clone --recursive https://github.com/redis-inference-optimization/redis-inference-optimization
 ```
 
 Switch to the project's directory with:
 
 ```sh
-cd RedisAI
+cd redis-inference-optimization
 ```
 
 ### Building the Dependencies
-Use the following script to download and build the libraries of the various RedisAI backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only:
+Use the following script to download and build the libraries of the various redis-inference-optimization backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only:
 
 ```sh
 bash get_deps.sh
@@ -72,14 +73,14 @@ bash get_deps.sh gpu
 ```
 
 ### Building the Module
-Once the dependencies have been built, you can build the RedisAI module with:
+Once the dependencies have been built, you can build the redis-inference-optimization module with:
 
 ```sh
 make -C opt clean ALL=1
 make -C opt
 ```
 
-Alternatively, run the following to build RedisAI with GPU support:
+Alternatively, run the following to build redis-inference-optimization with GPU support:
 
 ```sh
 make -C opt clean ALL=1
@@ -88,40 +89,39 @@ make -C opt GPU=1
 
 ### Backend Dependency
 
-RedisAI currently supports PyTorch (libtorch), Tensorflow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between RedisAI and supported backends. This extremely important since the serialization mechanism of one version might not match with another. For making sure your model will work with a given RedisAI version, check with the backend documentation about incompatible features between the version of your backend and the version RedisAI is built with.
+redis-inference-optimization currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and supported backends. This is extremely important, since the serialization mechanism of one version might not match another's. To make sure your model will work with a given redis-inference-optimization version, check the backend documentation for incompatible features between the version of your backend and the version redis-inference-optimization is built with.
 
 
-| RedisAI | PyTorch | TensorFlow | TFLite | ONNXRuntime |
+| redis-inference-optimization | PyTorch | TensorFlow | TFLite | ONNXRuntime |
 |:--------|:--------:|:----------:|:------:|:-----------:|
 | 1.0.3   | 1.5.0   | 1.15.0     | 2.0.0  | 1.2.0       |
 | 1.2.7   | 1.11.0  | 2.8.0      | 2.0.0  | 1.11.1      |
 | master  | 1.11.0  | 2.8.0      | 2.0.0  | 1.11.1      |
 
-Note: Keras and TensorFlow 2.x are supported through graph freezing. See [this script](http://dev.cto.redis.s3.amazonaws.com/RedisAI/test_data/tf2-minimal.py) to see how to export a frozen graph from Keras and TensorFlow 2.x.
+Note: Keras and TensorFlow 2.x are supported through graph freezing.
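When pinning dependencies, the version map in the table above can be expressed programmatically. A minimal sketch (the data is copied verbatim from the table; the helper function name is my own):

```python
# Version map from the table above: module release -> backend versions it was built against.
BACKENDS = {
    "1.0.3":  {"PyTorch": "1.5.0",  "TensorFlow": "1.15.0", "TFLite": "2.0.0", "ONNXRuntime": "1.2.0"},
    "1.2.7":  {"PyTorch": "1.11.0", "TensorFlow": "2.8.0",  "TFLite": "2.0.0", "ONNXRuntime": "1.11.1"},
    "master": {"PyTorch": "1.11.0", "TensorFlow": "2.8.0",  "TFLite": "2.0.0", "ONNXRuntime": "1.11.1"},
}

def backend_version(module_version: str, backend: str) -> str:
    """Look up the backend version that a given module release expects."""
    return BACKENDS[module_version][backend]

# e.g. export a model with the matching libtorch version before serializing it:
print(backend_version("1.2.7", "PyTorch"))  # 1.11.0
```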
 
 ## Loading the Module
 To load the module upon starting the Redis server, simply use the `--loadmodule` command line switch, the `loadmodule` configuration directive, or the [Redis `MODULE LOAD` command](https://redis.io/commands/module-load) with the path to the module's library.
 
 For example, to load the module from the project's path with a server command line switch, use the following:
 
 ```sh
-redis-server --loadmodule ./install-cpu/redisai.so
+redis-server --loadmodule ./install-cpu/redis-inference-optimization.so
 ```
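The `loadmodule` configuration directive mentioned above can also be placed in the server's configuration file instead of the command line. A minimal sketch, assuming the library was built to `./install-cpu/redis-inference-optimization.so` (the path is illustrative):

```
# redis.conf — load the module at server startup (illustrative path)
loadmodule ./install-cpu/redis-inference-optimization.so
```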
 
 ### Give it a try
 
-Once loaded, you can interact with RedisAI using redis-cli. Basic information and examples for using the module is described [here](https://oss.redis.com/redisai/intro/#getting-started).
+Once loaded, you can interact with redis-inference-optimization using redis-cli. Basic information and examples for using the module are described [here](https://oss.redis.com/redis-inference-optimization/intro/#getting-started).
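The module's commands travel over the ordinary Redis wire protocol, so any Redis client can issue them, not just redis-cli. As an illustration of what that looks like on the wire, here is a sketch of RESP encoding; the `AI.TENSORSET` name and its arguments are taken from the historical RedisAI command set and are illustrative only:

```python
def encode_resp(*parts):
    """Encode a Redis command as a RESP array of bulk strings,
    the same framing a Redis client sends over the socket."""
    chunks = [f"*{len(parts)}\r\n".encode()]
    for part in parts:
        data = part if isinstance(part, bytes) else str(part).encode()
        chunks.append(b"$" + str(len(data)).encode() + b"\r\n" + data + b"\r\n")
    return b"".join(chunks)

# Illustrative: the bytes a client would send to store a 2-element float tensor.
wire = encode_resp("AI.TENSORSET", "my_tensor", "FLOAT", 2, "VALUES", 2, 3)
print(wire)
```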
 
 ### Client libraries
-Some languages already have client libraries that provide support for RedisAI's commands. The following table lists the known ones:
+Some languages already have client libraries that provide support for redis-inference-optimization's commands. The following table lists the known ones:
 
 | Project | Language | License | Author | URL |
 | ------- | -------- | ------- | ------ | --- |
-| JRedisAI | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/JRedisAI) |
-| redisai-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-py) |
-| redisai-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-go) |
-| redisai-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-js) |
+| Jredis-inference-optimization | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/Jredis-inference-optimization) |
+| redis-inference-optimization-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-py) |
+| redis-inference-optimization-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-go) |
+| redis-inference-optimization-js | TypeScript/JavaScript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-js) |
 | redis-modules-sdk | TypeScript | BSD-3-Clause | [Dani Tseitlin](https://github.com/danitseitlin) | [Github](https://github.com/danitseitlin/redis-modules-sdk) |
 | redis-modules-java | Java | Apache-2.0 | [dengliming](https://github.com/dengliming) | [Github](https://github.com/dengliming/redis-modules-java) |
 | smartredis | C++ | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) |
@@ -131,16 +131,13 @@ Some languages already have client libraries that provide support for RedisAI's
 
 
 
-The full documentation for RedisAI's API can be found at the [Commands page](commands.md).
-
-## Documentation
-Read the docs at [redisai.io](https://oss.redis.com/redisai/).
+The full documentation for redis-inference-optimization's API can be found at the [Commands page](commands.md).
 
 ## Contact Us
 If you have questions, want to provide feedback or perhaps report an issue or [contribute some code](contrib.md), here's where we're listening to you:
 
-* [Forum](https://forum.redis.com/c/modules/redisai)
-* [Repository](https://github.com/RedisAI/RedisAI/issues)
+* [Forum](https://forum.redis.com/c/modules/redis-inference-optimization)
+* [Repository](https://github.com/redis-inference-optimization/redis-inference-optimization/issues)
 
 ## License
-RedisAI is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).
+redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).
