feat: async DAG executor + AI/LLM integration in the VWB

- DAGExecutor: workflow execution via a dependency graph,
  parallel LLM steps, sequential UI steps, ${step.result} injection
- LLMActionHandler: analyze_text, translate, extract_data, generate_text
  via Ollama /api/chat (qwen3-vl:8b, temperature 0.1)
- VWB palette: "AI / LLM" category with 4 draggable actions
- VWB properties: editors for each LLM action (model, prompt, language)
- VWB endpoint: POST /api/v3/workflow/<id>/execute-dag
- 37 unit tests for the DAG executor (all passing)
- Fix workflow-cache log spam (info → debug)
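The dependency-graph scheduling and `${step.result}` injection described above can be sketched as follows. This is a minimal illustration, not the actual DAGExecutor: `run_dag`, `inject`, and the step-dict shape are hypothetical names assumed for the example; only the behavior (LLM steps in parallel, UI steps sequential, placeholder substitution) comes from the commit message.

```python
import asyncio
import re

# Matches ${step_id.result} placeholders (hypothetical syntax per the commit message)
PLACEHOLDER = re.compile(r"\$\{(\w+)\.result\}")

def inject(text, results):
    """Replace ${step_id.result} placeholders with earlier step results."""
    return PLACEHOLDER.sub(lambda m: str(results.get(m.group(1), m.group(0))), text)

async def run_dag(steps, run_step):
    """steps: {step_id: {"deps": [...], "kind": "llm" | "ui", "input": str}}."""
    results, pending = {}, dict(steps)
    while pending:
        # A step is ready once all of its dependencies have produced a result
        ready = [sid for sid, s in pending.items()
                 if all(d in results for d in s["deps"])]
        if not ready:
            raise ValueError("dependency cycle in workflow")
        llm = [sid for sid in ready if pending[sid]["kind"] == "llm"]
        ui = [sid for sid in ready if pending[sid]["kind"] == "ui"]
        # LLM steps run concurrently...
        outs = await asyncio.gather(
            *(run_step(sid, inject(pending[sid]["input"], results)) for sid in llm))
        results.update(zip(llm, outs))
        # ...while UI steps run one at a time, in order
        for sid in ui:
            results[sid] = await run_step(sid, inject(pending[sid]["input"], results))
        for sid in ready:
            del pending[sid]
    return results
```

Each scheduling round gathers every step whose dependencies are satisfied, so independent LLM calls in the same round overlap, while UI steps stay strictly ordered.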
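The LLMActionHandler's call to Ollama's `/api/chat` endpoint might look roughly like this. A sketch only: `ollama_chat` and `build_chat_payload` are hypothetical names, and the base URL assumes a default local Ollama install; the endpoint path, model name, and temperature come from the commit message.

```python
import json
import urllib.request

def build_chat_payload(prompt, model="qwen3-vl:8b", temperature=0.1):
    # Non-streaming chat request body for Ollama's /api/chat endpoint
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        "options": {"temperature": temperature},
    }

def ollama_chat(prompt, base_url="http://localhost:11434", **kwargs):
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(build_chat_payload(prompt, **kwargs)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The assistant's reply lives under message.content in the response JSON
        return json.loads(resp.read())["message"]["content"]
```

The low temperature (0.1) keeps outputs near-deterministic, which suits extraction and translation steps better than creative generation.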

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Author: Dom
Date: 2026-03-16 22:58:44 +01:00
parent ad15237fe0
commit 5e3865d328
11 changed files with 2911 additions and 2 deletions


@@ -314,6 +314,35 @@ VWB_ACTION_CONTRACTS: Dict[str, ActionContract] = {
optional_params=["match_mode", "case_sensitive"],
param_validators={"visual_anchor": lambda p: has_visual_anchor({"visual_anchor": p})}
),
# --- DAG LLM ACTIONS (executed via the DAGExecutor) ---
"llm_analyze": ActionContract(
action_type="llm_analyze",
description="Analyze/summarize text via LLM (DAGExecutor)",
required_params=[],
optional_params=["text", "instruction", "model", "temperature"],
),
"llm_translate": ActionContract(
action_type="llm_translate",
description="Translate text via LLM (DAGExecutor)",
required_params=[],
optional_params=["text", "target_lang", "source_lang", "model", "temperature"],
),
"llm_extract_data": ActionContract(
action_type="llm_extract_data",
description="Extract structured data from text via LLM (DAGExecutor)",
required_params=[],
optional_params=["text", "schema", "model", "temperature"],
),
"llm_generate": ActionContract(
action_type="llm_generate",
description="Generate text via LLM (DAGExecutor)",
required_params=[],
optional_params=["prompt", "context", "model", "temperature"],
),
}
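A contract like the entries above can drive a simple parameter check before a step runs. This sketch models `ActionContract` as a plain dict for self-containment; the `check_params` helper is hypothetical, but the required/optional field split mirrors the contracts shown.

```python
# Contract for llm_analyze, modeled as a dict (the real code uses ActionContract)
LLM_ANALYZE = {
    "required_params": [],
    "optional_params": ["text", "instruction", "model", "temperature"],
}

def check_params(contract, params):
    """Return (missing required params, params not declared in the contract)."""
    missing = [p for p in contract["required_params"] if p not in params]
    allowed = set(contract["required_params"]) | set(contract["optional_params"])
    unknown = [p for p in params if p not in allowed]
    return missing, unknown

missing, unknown = check_params(LLM_ANALYZE, {"text": "hi", "foo": 1})
# missing == [], unknown == ["foo"]
```

Because all four LLM contracts declare empty `required_params`, a step with no parameters at all is still valid; defaults (model, temperature) are presumably filled in by the handler.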