TheUnhandled exception. The parameter is incorrect. #883

Open
sirredbeard opened this issue Sep 9, 2024 · 0 comments


Describe the bug

Running the directml/directml-int4-awq-block-128 model from microsoft/Phi-3-mini-4k-instruct-onnx, using the LabsPhi301 sample as a base, on Windows on Arm (Windows Dev Kit 2023) fails with the following error:

dotnet run
unknown ARM CPU part 0xd4b ignored
unknown ARM CPU part 0xd4b ignored
unknown ARM CPU part 0xd4b ignored
unknown ARM CPU part 0xd4b ignored
unknown ARM CPU part 0xd4c ignored
unknown ARM CPU part 0xd4c ignored
unknown ARM CPU part 0xd4c ignored
unknown ARM CPU part 0xd4c ignored
Ask your question. Type an empty string to exit.

Q: Why is Arm advantageous over x86?
Phi3:  TheUnhandled exception. Microsoft.ML.OnnxRuntimeGenAI.OnnxRuntimeGenAIException: D:\a\_work\1\onnxruntime-genai\src\dml\dml_update_mask_kernel.cpp(72)\onnxruntime-genai.DLL!00007FFDACF331E0: (caller: 00007FFDACF13880) Exception(1) tid(4dd0) 80070057 The parameter is incorrect.

   at Microsoft.ML.OnnxRuntimeGenAI.Generator.ComputeLogits()
   at Program.<Main>$(String[] args) in D:\Program.cs:line 63
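As an aside on reading the log: the 80070057 in the exception message is the Windows HRESULT E_INVALIDARG ("The parameter is incorrect"), i.e. FACILITY_WIN32 wrapping Win32 error 87 (ERROR_INVALID_PARAMETER). A minimal sketch decoding it (the helper name is illustrative, not part of any API here):

```python
def decode_hresult(hr: int) -> dict:
    """Split a Windows HRESULT into its severity, facility, and code fields."""
    return {
        "failure": bool(hr >> 31 & 1),   # severity bit: 1 means failure
        "facility": hr >> 16 & 0x1FFF,   # 7 = FACILITY_WIN32
        "code": hr & 0xFFFF,             # wrapped Win32 error code
    }

print(decode_hresult(0x80070057))
# {'failure': True, 'facility': 7, 'code': 87}  -> E_INVALIDARG
```

So the DML kernel in dml_update_mask_kernel.cpp is rejecting one of its arguments, rather than hitting a driver or out-of-memory failure.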

To Reproduce
Steps to reproduce the behavior:

  1. git clone https://github.com/microsoft/Phi-3CookBook/
  2. Navigate to /md/07.Labs/Csharp/src/LabsPhi301
  3. Set generatorParams.SetSearchOption("past_present_share_buffer", true); per issue #863 (Non-zero status code returned while running DmlFusedNode_0_0 node)
  4. Copy directml/directml-int4-awq-block-128 from microsoft/Phi-3-mini-4k-instruct-onnx into place
  5. dotnet restore
  6. dotnet run

Expected behavior

Be able to chat with the directml-int4-awq-block-128 build of Phi-3-mini-4k-instruct.

Screenshots

[screenshot: the error output shown above]

Desktop (please complete the following information):

  • Windows 11 on Arm
  • Windows Dev Kit 2023
  • .NET 8 latest SDK

Additional context

Program.cs:

using Microsoft.ML.OnnxRuntimeGenAI;

var modelPath = Path.Combine(AppContext.BaseDirectory, @"Phi-3-mini-4k-instruct-onnx\directml-int4-awq-block-128");
var model = new Model(modelPath);
var tokenizer = new Tokenizer(model);

var systemPrompt = "You are an AI assistant that helps people find information. Answer questions using a direct style. Do not share more information than is requested by the users.";

// chat start
Console.WriteLine(@"Ask your question. Type an empty string to exit.");

// chat loop
while (true)
{
    // Get user question
    Console.WriteLine();
    Console.Write(@"Q: ");
    var userQ = Console.ReadLine();    
    if (string.IsNullOrEmpty(userQ))
    {
        break;
    }

    // show phi3 response
    Console.Write("Phi3: ");
    var fullPrompt = $"<|system|>{systemPrompt}<|end|><|user|>{userQ}<|end|><|assistant|>";
    var tokens = tokenizer.Encode(fullPrompt);

    var generatorParams = new GeneratorParams(model);
    generatorParams.SetSearchOption("max_length", 1048);
    generatorParams.SetSearchOption("past_present_share_buffer", true);
    generatorParams.SetInputSequences(tokens);

    var generator = new Generator(model, generatorParams);
    while (!generator.IsDone())
    {
        generator.ComputeLogits();
        generator.GenerateNextToken();
        var outputTokens = generator.GetSequence(0);
        var newToken = outputTokens.Slice(outputTokens.Length - 1, 1);
        var output = tokenizer.Decode(newToken);
        Console.Write(output);
    }
    Console.WriteLine();
}
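For reference, the prompt string built inside the loop follows the Phi-3 instruct chat template: system, user, and assistant segments delimited by <|system|>, <|user|>, <|end|>, and <|assistant|> markers, with the assistant segment left open for generation. A language-neutral sketch of the same template (the function name is illustrative):

```python
def build_phi3_prompt(system: str, user: str) -> str:
    # Phi-3 instruct chat template: each completed segment ends with <|end|>;
    # the trailing <|assistant|> marker is left open so the model continues from it.
    return f"<|system|>{system}<|end|><|user|>{user}<|end|><|assistant|>"

print(build_phi3_prompt("Be brief.", "Why is Arm advantageous over x86?"))
```

This matches the fullPrompt string in Program.cs above, so a malformed prompt is unlikely to be the cause; the failure occurs inside the DML mask-update kernel during ComputeLogits().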