Performance Reporting
The TypeSpec compiler can report performance statistics after compilation, helping you identify bottlenecks and optimize your build process. Enable this feature by passing the --stats flag to the CLI:
```sh
tsp compile . --stats
```

```
TypeSpec compiler v1.9.0
Compilation completed successfully.

Compiler statistics:
  Complexity:
    Created types: 494
    Finished types: 319
  Performance:
    loader: 19ms
    resolver: 6ms
    checker: 8ms
    validation: 0ms
    compiler: 0ms
    linter: 0ms
```

The report includes:
- Complexity metrics: Number of types created and finished during compilation
- Performance breakdown: Time spent in each compilation phase (loading, resolving, type-checking, validation, and linting)
Emitter Performance Reporting
Emitters can report their own performance statistics, which are displayed alongside the compiler metrics in the same report.
Use the EmitContext.perf API to instrument your emitter code. The API provides several methods depending on your use case.
startTimer - Manual Timer Control
Best when the start and stop points are in different parts of your code, or when you need conditional timing:
```ts
const timer = context.perf.startTimer("my-task");
// ... do some work across multiple statements
timer.stop();
```
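For the conditional case, a minimal sketch; the trace option is hypothetical, not a built-in emitter option:

```ts
// Hypothetical: only measure when the emitter's "trace" option is enabled.
const timer = context.options["trace"] ? context.perf.startTimer("my-task") : undefined;
// ... do some work
timer?.stop(); // safe no-op when timing is disabled
```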
time - Synchronous Function Timing
Best for wrapping synchronous code blocks. Returns the result of the callback function:
```ts
const result = context.perf.time("my-task", () => {
  // ... do some work
  return computedValue;
});
```
timeAsync - Asynchronous Function Timing
Best for wrapping async operations. Returns a promise with the callback's result:
```ts
const result = await context.perf.timeAsync("my-task", async () => {
  // ... do some async work
  return await fetchData();
});
```
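As a more concrete sketch, timing a file write done with the compiler's emitFile helper; the output file name and content here are illustrative:

```ts
import { emitFile, resolvePath } from "@typespec/compiler";

const generatedSource = "// ... generated code"; // illustrative content
await context.perf.timeAsync("write", async () =>
  emitFile(context.program, {
    path: resolvePath(context.emitterOutputDir, "models.ts"), // illustrative path
    content: generatedSource,
  }),
);
```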
reportTime - Report Pre-measured Duration
Best when you already have timing data from another source (e.g., a child process or external tool):
```ts
const { duration } = runTask();
context.perf.reportTime("my-task", duration);
```

You can use the standalone perf utilities to measure duration in code that doesn't have access to the emit context:
```ts
import { perf } from "@typespec/compiler/utils";

function runTask(): { duration: number } {
  const timer = perf.startTimer();
  // ... do some work
  return { duration: timer.end() };
}
```
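For the child-process case mentioned under reportTime, the two APIs can be combined; shelling out to prettier here is purely illustrative:

```ts
import { spawnSync } from "node:child_process";
import { perf } from "@typespec/compiler/utils";

// Hypothetical: measure an external formatter run and report it as a phase.
const timer = perf.startTimer();
spawnSync("prettier", ["--write", "tsp-output/models.ts"]); // illustrative command
context.perf.reportTime("format", timer.end());
```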
Complete Example
Here's how to instrument a typical emitter with multiple phases:
```ts
import { EmitContext } from "@typespec/compiler";

export async function $onEmit(context: EmitContext) {
  // Manual timer for the preparation phase
  const timer = context.perf.startTimer("prepare");
  prepare();
  timer.stop();

  // Wrap synchronous rendering with automatic timing
  const renderResult = context.perf.time("render", () => render());

  // Wrap async file writing with automatic timing
  await context.perf.timeAsync("write", async () => writeOutput(renderResult));
}
```

Running tsp compile . --stats with this instrumented emitter produces:
```sh
tsp compile . --stats
```

```
TypeSpec compiler v1.9.0
Compilation completed successfully.

Compiler statistics:
  Complexity:
    Created types: 494
    Finished types: 319
  Performance:
    loader: 19ms
    resolver: 6ms
    checker: 8ms
    validation: 0ms
    compiler: 0ms
    linter: 0ms
    emit: 128ms
      my-emitter: 128ms
        prepare: 39ms
        render: 28ms
        write: 51ms
```

The emitter's custom metrics (prepare, render, write) appear nested under the emitter name, giving you a clear breakdown of where time is spent during code generation.