stop publicly exposing mode
hsubox76 committed Sep 2, 2025
commit cadfd095c226c96ee7097c9bc9f7502c9711e184
common/api-review/ai.api.md — 1 change: 0 additions & 1 deletion

```diff
@@ -152,7 +152,6 @@ export interface ChromeAdapter
     generateContent(request: GenerateContentRequest): Promise<Response>;
     generateContentStream(request: GenerateContentRequest): Promise<Response>;
     isAvailable(request: GenerateContentRequest): Promise<boolean>;
-    mode: InferenceMode;
 }
```

// @public
Expand Down
docs-devsite/ai.chromeadapter.md — 16 changes: 0 additions & 16 deletions

````diff
@@ -20,12 +20,6 @@ These methods should not be called directly by the user.
 export interface ChromeAdapter
 ```
 
-## Properties
-
-| Property | Type | Description |
-| --- | --- | --- |
-| [mode](./ai.chromeadapter.md#chromeadaptermode) | [InferenceMode](./ai.md#inferencemode) | The inference mode. |
-
 ## Methods
 
 | Method | Description |
@@ -34,16 +28,6 @@ export interface ChromeAdapter
 | [generateContentStream(request)](./ai.chromeadapter.md#chromeadaptergeneratecontentstream) | Generates a content stream using on-device inference. |
 | [isAvailable(request)](./ai.chromeadapter.md#chromeadapterisavailable) | Checks if the on-device model is capable of handling a given request. |
 
-## ChromeAdapter.mode
-
-The inference mode.
-
-<b>Signature:</b>
-
-```typescript
-mode: InferenceMode;
-```
-
 ## ChromeAdapter.generateContent()
 
 Generates content using on-device inference.
````
packages/ai/src/methods/helpers.ts — 4 changes: 2 additions & 2 deletions

```diff
@@ -17,7 +17,7 @@
 
 import { AIError } from '../errors';
 import { GenerateContentRequest, InferenceMode, AIErrorCode } from '../types';
-import { ChromeAdapter } from '../types/chrome-adapter';
+import { ChromeAdapterImpl } from './chrome-adapter';
 
 /**
  * Dispatches a request to the appropriate backend (on-device or in-cloud)
@@ -31,7 +31,7 @@ import { ChromeAdapter } from '../types/chrome-adapter';
  */
 export async function callCloudOrDevice<Response>(
   request: GenerateContentRequest,
-  chromeAdapter: ChromeAdapter | undefined,
+  chromeAdapter: ChromeAdapterImpl | undefined,
   onDeviceCall: () => Promise<Response>,
   inCloudCall: () => Promise<Response>
 ): Promise<Response> {
```
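With this change, the dispatch helper takes the concrete `ChromeAdapterImpl`, which still carries `mode` internally even though the public `ChromeAdapter` interface no longer exposes it. A minimal sketch of how such a dispatcher might branch on the mode (type shapes, mode values, and error handling here are simplified assumptions for illustration, not the SDK's actual implementation):

```typescript
// Hypothetical, simplified mode values; the real SDK defines InferenceMode
// in its types package.
type InferenceMode = 'prefer_on_device' | 'only_on_device' | 'only_in_cloud';

// Simplified stand-in for the internal adapter implementation.
interface ChromeAdapterImpl {
  mode: InferenceMode;
  isAvailable(request: unknown): Promise<boolean>;
}

// Dispatches to the on-device or in-cloud backend based on the adapter's
// (now internal-only) mode.
async function callCloudOrDevice<Response>(
  request: unknown,
  chromeAdapter: ChromeAdapterImpl | undefined,
  onDeviceCall: () => Promise<Response>,
  inCloudCall: () => Promise<Response>
): Promise<Response> {
  // No adapter configured: always use the cloud backend.
  if (!chromeAdapter) {
    return inCloudCall();
  }
  switch (chromeAdapter.mode) {
    case 'only_in_cloud':
      return inCloudCall();
    case 'only_on_device':
      if (await chromeAdapter.isAvailable(request)) {
        return onDeviceCall();
      }
      throw new Error('On-device model is not available.');
    case 'prefer_on_device':
    default:
      // Fall back to the cloud when the on-device model cannot serve
      // this request.
      return (await chromeAdapter.isAvailable(request))
        ? onDeviceCall()
        : inCloudCall();
  }
}
```

Because only the implementation class (not the public interface) carries `mode`, internal helpers like this keep full access to it while the published API surface stays behavior-only.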
packages/ai/src/types/chrome-adapter.ts — 6 changes: 0 additions & 6 deletions

```diff
@@ -16,7 +16,6 @@
  */
 
 import { CountTokensRequest, GenerateContentRequest } from './requests';
-import { InferenceMode } from './enums';
 
 /**
  * <b>(EXPERIMENTAL)</b> Defines an inference "backend" that uses Chrome's on-device model,
@@ -28,11 +27,6 @@ import { InferenceMode } from './enums';
  * @public
  */
 export interface ChromeAdapter {
-  /**
-   * The inference mode.
-   */
-  mode: InferenceMode;
-
   /**
    * Checks if the on-device model is capable of handling a given
    * request.
```
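The net effect of the change is a split between the public interface and the internal class. A minimal sketch of that split, with assumed names and simplified types (the real interface also declares `generateContent`, `generateContentStream`, and `countTokens`, omitted here for brevity):

```typescript
// Hypothetical, simplified mode values for illustration.
type InferenceMode = 'prefer_on_device' | 'only_on_device' | 'only_in_cloud';

// Public surface: behavior only, no `mode` property, so `mode` can change
// or disappear without a breaking API change.
interface ChromeAdapter {
  isAvailable(request: unknown): Promise<boolean>;
}

// Internal implementation: package helpers that need the mode accept
// ChromeAdapterImpl directly, so `mode` never appears in the public
// API report (common/api-review/ai.api.md).
class ChromeAdapterImpl implements ChromeAdapter {
  constructor(readonly mode: InferenceMode) {}

  async isAvailable(_request: unknown): Promise<boolean> {
    // Simplified availability check for illustration only.
    return this.mode !== 'only_in_cloud';
  }
}
```

Keeping `mode` on the implementation class rather than the interface is a common way to shrink a published API surface: consumers program against the interface, while package internals retain access to the extra state.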