Synchronous vs Asynchronous Execution | Coroutine | Event Loop 06 Mar, 2026

1️⃣ Synchronous vs Asynchronous execution

2️⃣ Coroutine

3️⃣ Event Loop

Once these are clear, the difference between:

await self.app.ainvoke()

and

asyncio.run(self.app.ainvoke())

becomes very simple.


1️⃣ Synchronous Execution (Normal Python)

Most Python code runs synchronously.

Example:

import time

def task():
    print("Start")
    time.sleep(3)
    print("End")

task()
print("Done")

Execution flow:

Start
(wait 3 seconds)
End
Done

Python waits for each task to finish.

Task1 → finish → Task2 → finish → Task3

This is called blocking execution.
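The blocking behaviour is easy to measure: run two tasks back to back and the waits add up (sleeps shortened to 0.1 s so the demo finishes quickly).

```python
# Two blocking tasks run sequentially: total time is the SUM of their waits.
import time

def task(name):
    print(f"{name}: Start")
    time.sleep(0.1)          # blocks the whole program while waiting
    print(f"{name}: End")

start = time.perf_counter()
task("Task1")
task("Task2")
elapsed = time.perf_counter() - start
print(f"Elapsed: {elapsed:.2f}s")  # roughly 0.2s: 0.1s + 0.1s
```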


2️⃣ Asynchronous Execution

Async allows a task to pause while it waits on I/O, so other tasks can run in the meantime.

Example:

import asyncio

async def task():
    print("Start")
    await asyncio.sleep(3)
    print("End")

asyncio.run(task())

Key difference:

await asyncio.sleep(3)

Instead of blocking, Python says:

"While waiting, I can do something else."

Execution model:

Task1 waiting
Task2 running
Task3 running

This improves performance in I/O operations like:

  • API calls

  • database queries

  • LLM requests

  • web scraping

Your Groq call is an I/O operation.
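The overlap is measurable. With asyncio.gather, two 0.1 s waits finish in roughly 0.1 s total, not 0.2 s:

```python
# With async, the waits OVERLAP instead of adding up.
import asyncio
import time

async def task(name):
    print(f"{name}: Start")
    await asyncio.sleep(0.1)   # yields control to the loop instead of blocking
    print(f"{name}: End")

async def main():
    # gather() runs both coroutines on the same event loop concurrently
    await asyncio.gather(task("Task1"), task("Task2"))

start = time.perf_counter()
asyncio.run(main())
elapsed = time.perf_counter() - start
print(f"Elapsed: {elapsed:.2f}s")  # ~0.1s: the two waits overlap
```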


3️⃣ What is a Coroutine?

A coroutine is the object produced by calling a function defined with async def.

Example:

async def fetch_data():
    return "data"

Calling it:

fetch_data()

does NOT run it.

It returns a coroutine object:

<coroutine object fetch_data at 0x123>

To execute it you must use:

await

or

asyncio.run()
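Both points are easy to verify in a few lines:

```python
# Calling an async function does NOT run it; it builds a coroutine object.
import asyncio
import inspect

async def fetch_data():
    return "data"

coro = fetch_data()
print(inspect.iscoroutine(coro))  # True: nothing has executed yet
coro.close()  # discard the unstarted coroutine (avoids a RuntimeWarning)

# To actually execute it, hand it to an event loop:
result = asyncio.run(fetch_data())
print(result)  # data
```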

4️⃣ What is an Event Loop?

The event loop is the engine that runs async code.

Think of it like a task manager.

Event Loop
   │
   ├── Task 1
   ├── Task 2
   ├── Task 3
   └── Task 4

It decides:

  • which task runs

  • which task waits

  • which task resumes
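A small sketch of the loop juggling two tasks: each await hands control back to the loop, which then resumes whichever task is ready.

```python
# The event loop interleaves tasks at every await point.
import asyncio

order = []

async def worker(name):
    order.append(f"{name} start")
    await asyncio.sleep(0)      # yield control back to the loop
    order.append(f"{name} resume")

async def main():
    # create_task() registers each coroutine with the running loop
    t1 = asyncio.create_task(worker("Task1"))
    t2 = asyncio.create_task(worker("Task2"))
    await asyncio.gather(t1, t2)

asyncio.run(main())
print(order)
# ['Task1 start', 'Task2 start', 'Task1 resume', 'Task2 resume']
```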


5️⃣ Now Let's Understand Your Two Statements

Case 1

result = await self.app.ainvoke(initial_question)

This means:

"Run this coroutine inside the existing event loop."

But it only works inside an async function.

Example:

async def run_pipeline():
    result = await self.app.ainvoke(initial_question)
    print(result)

Flow:

Async function
      │
      ▼
Event loop already running
      │
      ▼
await ainvoke()
      │
      ▼
Return result

6️⃣ Case 2

result = asyncio.run(self.app.ainvoke(initial_question))

This means:

"Create an event loop and execute this coroutine."

Used in normal synchronous scripts.

Example:

def run_pipeline():
    result = asyncio.run(self.app.ainvoke(initial_question))
    print(result)

Flow:

Normal Python script
      │
      ▼
asyncio.run()
      │
      ▼
Create event loop
      │
      ▼
Run coroutine
      │
      ▼
Close loop

7️⃣ Visual Comparison

await

Async function
      │
      ▼
Existing event loop
      │
      ▼
Run coroutine

asyncio.run()

Normal script
      │
      ▼
asyncio.run()
      │
      ▼
Create event loop
      │
      ▼
Run coroutine
      │
      ▼
Destroy loop
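The two paths side by side, with a dummy ainvoke coroutine standing in for self.app.ainvoke (an assumption for the sketch, not the real API):

```python
# Both paths run the same coroutine; the difference is who owns the event loop.
import asyncio

async def ainvoke(question):
    # dummy stand-in for self.app.ainvoke (assumption for illustration)
    await asyncio.sleep(0)          # simulated I/O
    return f"answer to {question}"

# Case 1: inside an async function, a loop already exists, so we await.
async def run_pipeline_async():
    return await ainvoke("initial_question")

# Case 2: in a plain function there is no loop, so asyncio.run() creates one.
def run_pipeline_sync():
    return asyncio.run(ainvoke("initial_question"))

print(asyncio.run(run_pipeline_async()))  # loop created once, await inside it
print(run_pipeline_sync())                # loop created and closed by asyncio.run()
```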

8️⃣ Why Python Restricts await

This is illegal:

def run():
    result = await my_function()

Error:

SyntaxError: 'await' outside async function

Because await is only legal inside a coroutine. Outside an async function there is nothing for the event loop to suspend and resume, so Python rejects the code at compile time, before any loop even exists.
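You can see that the check happens at compile time, before any event loop is involved:

```python
# Compiling (not running) the bad code already raises SyntaxError.
src = """
def run():
    result = await my_function()
"""

msg = None
try:
    compile(src, "<demo>", "exec")
except SyntaxError as e:
    msg = e.msg

print(msg)  # 'await' outside async function
```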


9️⃣ Real Production Example

FastAPI

FastAPI already runs an event loop.

Client Request
      │
      ▼
FastAPI event loop
      │
      ▼
await ainvoke()

Example:

@app.post("/chat")
async def chat(req: ChatRequest):
    result = await graph.ainvoke({
        "messages": [HumanMessage(content=req.question)]
    })
    return result

(Here graph is the compiled LangGraph app and ChatRequest is a Pydantic model with a question field. Naming the graph app would shadow the FastAPI instance used in the decorator.)

Here asyncio.run() would crash with RuntimeError: asyncio.run() cannot be called from a running event loop.


🔟 EXAMPLE: CLI Pipeline

Suppose you are running:

python -m prod_assistant.workflow.agentic_workflow

So this is synchronous environment.

Best options:

Option 1 (simplest)

result = self.app.invoke(initial_question)

Option 2

result = asyncio.run(self.app.ainvoke(initial_question))

1️⃣1️⃣ Performance Difference

Feature        await           asyncio.run()
Event loop     existing        created
Speed          faster          slightly slower
Use case       async apps      scripts
Concurrency    high            moderate

1️⃣2️⃣ In LangGraph

LangGraph supports:

invoke()
ainvoke()
stream()
astream()

Example:

invoke → sync
ainvoke → async
stream → sync streaming
astream → async streaming

1️⃣3️⃣ What Advanced AI Systems Use

Modern AI assistants use async streaming:

User Question
      │
      ▼
LangGraph
      │
      ▼
astream()
      │
      ▼
Tokens stream to UI

This creates the ChatGPT typing effect.
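The typing effect boils down to an async generator. Here is a stdlib-only sketch of the pattern; the real astream() yields chunks from the graph, not words from a string:

```python
# Async streaming: the consumer receives tokens one at a time as they arrive.
import asyncio

async def astream_tokens(answer):
    # stand-in for astream(): in a real app each token comes from the LLM
    for token in answer.split():
        await asyncio.sleep(0)      # simulated network delay between tokens
        yield token

async def main():
    received = []
    async for token in astream_tokens("streamed tokens appear one by one"):
        print(token, end=" ", flush=True)   # the UI shows the text as it grows
        received.append(token)
    print()
    return received

tokens = asyncio.run(main())
```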


Simple Rule

Normal Python script → asyncio.run()

Async framework (FastAPI, LangServe) → await

🔥 Why LangGraph uses async internally

This allows the LLM, retriever, web search, and tools to run concurrently, which can make I/O-bound agents several times faster.