This repository was archived by the owner on Oct 25, 2024. It is now read-only.

Hi Team,

I have an issue on an Intel Xeon Gold CPU running in an Ubuntu VM.
How do I set up NeuralChat to run with AVX2 only?

The error says the BF16 weight prepack needs the CPU to support avx512bw, avx512vl and avx512dq:

[2024-04-23 00:52:31,123] [ ERROR] - Failed to start server.
[2024-04-23 00:52:31,123] [ ERROR] - BF16 weight prepack needs the cpu support avx512bw, avx512vl and avx512dq, but the desired instruction sets are not available. Please set dtype to torch.float or set weights_prepack to False.


Replies: 1 comment


Hi @hnguy31-hvevn, yes, I think IPEX may optimize too aggressively. There is actually a parameter to turn off weights_prepack.

I've made a PR here: #1526. Could you have a look at whether it fixes your issue?
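For reference, here is a minimal sketch of the two workarounds the error message itself suggests, assuming the model is passed through `intel_extension_for_pytorch`'s `optimize()`; the toy model and exact call site are illustrative, and the real NeuralChat serving code may wire this up differently:

```python
import torch
import torch.nn as nn
import intel_extension_for_pytorch as ipex

# Toy model standing in for the real NeuralChat model (illustrative only).
model = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 4)).eval()

# Option 1: stay in float32, so no BF16 weight prepack is attempted
# (works on CPUs that only expose AVX2).
model_fp32 = ipex.optimize(model, dtype=torch.float32)

# Option 2: keep BF16 but disable weight prepacking, as the error suggests.
# model_bf16 = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)

with torch.no_grad():
    out = model_fp32(torch.randn(2, 16))
```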
