
Commit 50bf219

chore(deps): update container image docker.io/localai/localai to v2.9.0 by renovate (#18546)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.8.2-cublas-cuda11-ffmpeg-core` -> `v2.9.0-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.8.2-cublas-cuda11-core` -> `v2.9.0-cublas-cuda11-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.8.2-cublas-cuda12-ffmpeg-core` -> `v2.9.0-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.8.2-cublas-cuda12-core` -> `v2.9.0-cublas-cuda12-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.8.2-ffmpeg-core` -> `v2.9.0-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.8.2` -> `v2.9.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

---

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

### [`v2.9.0`](https://github.com/mudler/LocalAI/releases/tag/v2.9.0)

[Compare Source](https://github.com/mudler/LocalAI/compare/v2.8.2...v2.9.0)

This release brings many enhancements and fixes, and a special thanks to the community for the amazing work and contributions!

We now have sycl images for Intel GPUs, ROCm images for AMD GPUs, and much more:

- You can find the AMD GPU image tags among the available container images - look for `hipblas`, for example [master-hipblas-ffmpeg-core](https://quay.io/repository/go-skynet/local-ai/tag/master-hipblas-ffmpeg-core). Thanks to [@fenfir](https://github.com/fenfir) for this nice contribution!
- Intel GPU images are tagged with `sycl` and come in two flavors, sycl-f16 and sycl-f32, for example [master-sycl-f16](https://quay.io/repository/go-skynet/local-ai/tag/master-sycl-f16-core). Work is in progress to also support diffusers and transformers on Intel GPUs.
- Thanks to [@christ66](https://github.com/christ66), first efforts were made toward supporting the Assistant API, and we are planning to support it fully! Stay tuned for more!
- LocalAI now supports the Tools API endpoint - it also supports the (now deprecated) functions API call as usual. We now also have support for SSE with function calling. See [https://github.com/mudler/LocalAI/pull/1726](https://github.com/mudler/LocalAI/pull/1726) for more.
- Support for Gemma models - did you hear? Google released OSS models, and LocalAI already supports them!
- Thanks to [@dave-gray101](https://github.com/dave-gray101) for refactoring parts of the code in [https://github.com/mudler/LocalAI/pull/1728](https://github.com/mudler/LocalAI/pull/1728) - we are soon going to support more ways to interface with LocalAI, not only a RESTful API!

##### Support the project

First off, a massive thank you to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using [@LocalAI_OSS](https://twitter.com/LocalAI_API) and [@mudler_it](https://twitter.com/mudler_it) or joining our sponsorship program can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together! Thanks a ton, and here's to more exciting times ahead with LocalAI! 🚀

##### What's Changed

##### Bug fixes 🐛

- Add TTS dependency for cuda based builds, fixes [#1727](https://github.com/mudler/LocalAI/issues/1727), by [@blob42](https://github.com/blob42) in [https://github.com/mudler/LocalAI/pull/1730](https://github.com/mudler/LocalAI/pull/1730)

##### Exciting New Features 🎉

- Build docker container for ROCm by [@fenfir](https://github.com/fenfir) in [https://github.com/mudler/LocalAI/pull/1595](https://github.com/mudler/LocalAI/pull/1595)
- feat(tools): support Tool calls in the API by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/1715](https://github.com/mudler/LocalAI/pull/1715)
- Initial implementation of upload files api by [@christ66](https://github.com/christ66) in [https://github.com/mudler/LocalAI/pull/1703](https://github.com/mudler/LocalAI/pull/1703)
- feat(tools): Parallel function calling by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/1726](https://github.com/mudler/LocalAI/pull/1726)
- refactor: move part of api packages to core by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/1728](https://github.com/mudler/LocalAI/pull/1728)
- deps(llama.cpp): update, support Gemma models by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/1734](https://github.com/mudler/LocalAI/pull/1734)

##### 👒 Dependencies

- deps(llama.cpp): update by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/1714](https://github.com/mudler/LocalAI/pull/1714)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1740](https://github.com/mudler/LocalAI/pull/1740)

##### Other Changes

- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1718](https://github.com/mudler/LocalAI/pull/1718)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1705](https://github.com/mudler/LocalAI/pull/1705)
- Update README.md by [@lunamidori5](https://github.com/lunamidori5) in [https://github.com/mudler/LocalAI/pull/1739](https://github.com/mudler/LocalAI/pull/1739)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1750](https://github.com/mudler/LocalAI/pull/1750)

##### New Contributors

- [@fenfir](https://github.com/fenfir) made their first contribution in [https://github.com/mudler/LocalAI/pull/1595](https://github.com/mudler/LocalAI/pull/1595)
- [@christ66](https://github.com/christ66) made their first contribution in [https://github.com/mudler/LocalAI/pull/1703](https://github.com/mudler/LocalAI/pull/1703)
- [@blob42](https://github.com/blob42) made their first contribution in [https://github.com/mudler/LocalAI/pull/1730](https://github.com/mudler/LocalAI/pull/1730)

**Full Changelog**: mudler/LocalAI@v2.8.2...v2.9.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 10pm on monday" in timezone Europe/Amsterdam, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these updates again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR has been generated by [Renovate Bot](https://github.com/renovatebot/renovate).
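The release notes above highlight the new Tools API endpoint (PR #1715), which follows the OpenAI-compatible chat-completions schema that LocalAI exposes. As a rough illustration of what such a tool-calling request body looks like, here is a minimal Python sketch; the model name `gemma-2b` and the `get_weather` function are placeholders invented for this example, not values taken from this PR:

```python
import json

# Hypothetical tool-calling request in the OpenAI-compatible schema.
# "gemma-2b" and "get_weather" are illustrative placeholders only.
request_body = {
    "model": "gemma-2b",
    "messages": [
        {"role": "user", "content": "What is the weather in Amsterdam?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    # Let the model decide whether to call the tool.
    "tool_choice": "auto",
}

# Serialize for a POST to the server's chat-completions route.
payload = json.dumps(request_body)
print(len(request_body["tools"]))  # 1
```

The deprecated `functions` field mentioned in the notes would carry a similar shape; the `tools` array is the newer form this release adds support for.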
1 parent 0bbfa40 commit 50bf219

File tree: 2 files changed (+8 −8 lines)

charts/stable/local-ai/Chart.yaml

Lines changed: 2 additions & 2 deletions

```diff
@@ -7,7 +7,7 @@ annotations:
   truecharts.org/min_helm_version: "3.12"
   truecharts.org/train: stable
 apiVersion: v2
-appVersion: 2.8.2
+appVersion: 2.9.0
 dependencies:
   - name: common
     version: 18.0.1
@@ -34,4 +34,4 @@ sources:
   - https://github.com/truecharts/charts/tree/master/charts/stable/local-ai
   - https://hub.docker.com/r/localai/localai
 type: application
-version: 9.1.0
+version: 9.7.0
```

charts/stable/local-ai/values.yaml

Lines changed: 6 additions & 6 deletions

```diff
@@ -1,27 +1,27 @@
 image:
   repository: docker.io/localai/localai
   pullPolicy: IfNotPresent
-  tag: v2.8.2@sha256:7fea9bf502d71fd5330cd5d3e7e4910db484cfe97a56431d84c113ee03ef4546
+  tag: v2.9.0@sha256:1a9b06fc2e0f8c2e046816b0f472904ecfb99b2928c8448d4bbadc06cdea4021
 ffmpegImage:
   repository: docker.io/localai/localai
   pullPolicy: IfNotPresent
-  tag: v2.8.2-ffmpeg-core@sha256:f7bdc8d64650d33a09b6a13f5cbddb59a807908fc2afecde29edf4ac56b6c6f7
+  tag: v2.9.0-ffmpeg-core@sha256:26750f6ed498c898658862b2c8e341f8871ac8130e3e0576313fa973a5068fbe
 cublasCuda12Image:
   repository: docker.io/localai/localai
   pullPolicy: IfNotPresent
-  tag: v2.8.2-cublas-cuda12-core@sha256:6e9fdcbd8ee69962a411114b60622da1bb800b28c0d9f86a459f40f94816848c
+  tag: v2.9.0-cublas-cuda12-core@sha256:8416d0c1df61b33a387c1d879654461573a71813aac8eaaa65751fe7e11af22b
 cublasCuda12FfmpegImage:
   repository: docker.io/localai/localai
   pullPolicy: IfNotPresent
-  tag: v2.8.2-cublas-cuda12-ffmpeg-core@sha256:09c3531e2904f883f9580b40070a3ed84e71fbce08a7c5c5ecc095cb6f5030ad
+  tag: v2.9.0-cublas-cuda12-ffmpeg-core@sha256:f4d9d6c804b43ead0f38436c3a7757a9210d0ab56127f47ad111c6e22c509e76
 cublasCuda11Image:
   repository: docker.io/localai/localai
   pullPolicy: IfNotPresent
-  tag: v2.8.2-cublas-cuda11-core@sha256:9df853000abb7bdb15cac5239077694289c2bda8da19067d9b434e61755c06c3
+  tag: v2.9.0-cublas-cuda11-core@sha256:319d58c9ddfc806e9e81c58ed9d17186e41945b1fd72582fbff6403a848de05e
 cublasCuda11FfmpegImage:
   repository: docker.io/localai/localai
   pullPolicy: IfNotPresent
-  tag: v2.8.2-cublas-cuda11-ffmpeg-core@sha256:74b285bc21afd8ce90571bdbfa47d5aba7021a23f43bc99968cc606e52759e84
+  tag: v2.9.0-cublas-cuda11-ffmpeg-core@sha256:35ea8f29265f20252b1284e2f813af7f630ff7fd3e2207d1296aaf38fe7d961b
 securityContext:
   container:
     runAsNonRoot: false
```
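Every tag in the values.yaml change above is pinned by digest, using the `<version>@sha256:<digest>` form so the deployed image cannot silently drift even if a tag is re-pushed. A small Python sketch of how such a pinned tag splits into its two parts (the `split_pinned_tag` helper is illustrative, not part of the chart):

```python
def split_pinned_tag(tag: str) -> tuple[str, str]:
    """Split a digest-pinned image tag like 'v2.9.0@sha256:...' into
    (version, digest). Raises ValueError if no digest pin is present."""
    version, sep, digest = tag.partition("@")
    if not sep or not digest.startswith("sha256:"):
        raise ValueError(f"tag is not digest-pinned: {tag!r}")
    return version, digest

# Example using one tag from this diff:
tag = ("v2.9.0-ffmpeg-core@sha256:"
       "26750f6ed498c898658862b2c8e341f8871ac8130e3e0576313fa973a5068fbe")
version, digest = split_pinned_tag(tag)
print(version)  # v2.9.0-ffmpeg-core
```

Renovate updates both halves together: the human-readable version in front of the `@` and the immutable digest behind it, which is why each of the six image entries changes on a single line.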

0 commit comments
