/workspaces/seclab-taskflows (main) $ ./scripts/audit/run_audit.sh ekse/libwasm-vulnerable
** 🤖💪 Running Task Flow: seclab_taskflows.taskflows.audit.fetch_source_code
** 🤖💪 Deploying Task Flow Agent(s): ['seclab_taskflow_agent.personalities.assistant']
** 🤖💪 Task ID: 281a3dbe-89b3-4e4a-8ef7-652ca3b7411a
** 🤖💪 Model : gpt-4o
CRITICAL: Required environment variable DATA_DIR not found!
CRITICAL: Required environment variable LOG_DIR not found!
[03/07/26 17:28:51] INFO Starting MCP server 'LocalGHResources' with transport 'stdio' server.py:2491
** 🤖🛠️ Tool Call: c2052fa68d5bfetch_repo_from_gh
** 🤖💪 Running Task Flow: seclab_taskflows.taskflows.audit.identify_applications
** 🤖💪 Deploying Task Flow Agent(s): ['seclab_taskflow_agent.personalities.assistant']
** 🤖💪 Task ID: a3f89c05-3534-4356-a4a8-3728d5058c25
** 🤖💪 Model : gpt-4o
CRITICAL: Required environment variable MEMCACHE_STATE_DIR not found!
CRITICAL: Required environment variable MEMCACHE_BACKEND not found!
CRITICAL: Required environment variable LOG_DIR not found!
CRITICAL: Required environment variable DATA_DIR not found!
CRITICAL: Required environment variable LOG_DIR not found!
[03/07/26 17:28:54] INFO Starting MCP server 'Memcache' with transport 'stdio' server.py:2491
[03/07/26 17:28:56] INFO Starting MCP server 'RepoContext' with transport 'stdio' server.py:2491
** 🤖🛠️ Tool Call: 7ff9e676e031memcache_clear_cache
** 🤖🛠️ Tool Call: aed4eb5b8e06clear_repo
The memory cache has been cleared, and all results for the repository `ekse/libwasm-vulnerable` have been cleared.
** 🤖💪 Deploying Task Flow Agent(s): ['seclab_taskflows.personalities.web_application_security_expert']
** 🤖💪 Task ID: 91954424-f823-4424-8f41-30f1109c2692
** 🤖💪 Model : gpt-5.2, params: {'temperature': 1, 'reasoning': {'effort': 'high'}}
CRITICAL: Required environment variable DATA_DIR not found!
CRITICAL: Required environment variable LOG_DIR not found!
[03/07/26 17:29:00] INFO Starting MCP server 'GitHubFileViewer' with transport 'stdio' server.py:2491
[03/07/26 17:29:02] INFO Starting MCP server 'RepoContext' with transport 'stdio' server.py:2491
** 🤖❗ Request Error: Error code: 400 - {'error': {'message': 'The requested model is not supported.', 'code': 'model_not_supported', 'param': 'model', 'type': 'invalid_request_error'}}
ERROR: Bad Request
Traceback (most recent call last):
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/seclab_taskflow_agent/__main__.py", line 366, in deploy_task_agents
    await _run_streamed()
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/seclab_taskflow_agent/__main__.py", line 346, in _run_streamed
    async for event in result.stream_events():
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/agents/result.py", line 215, in stream_events
    raise self._stored_exception
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/agents/run.py", line 840, in _start_streaming
    turn_result = await cls._run_single_turn_streamed(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/agents/run.py", line 1009, in _run_single_turn_streamed
    async for event in model.stream_response(
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/agents/models/openai_chatcompletions.py", line 158, in stream_response
    response, stream = await self._fetch_response(
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/agents/models/openai_chatcompletions.py", line 276, in _fetch_response
    ret = await self._get_client().chat.completions.create(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 2583, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1794, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspaces/seclab-taskflows/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1594, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'The requested model is not supported.', 'code': 'model_not_supported', 'param': 'model', 'type': 'invalid_request_error'}}
CRITICAL: Required task not completed ... aborting!
🤖💥 *Required task not completed ...
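Separately from the model error, the CRITICAL lines in the log above name four environment variables the task flows expect (DATA_DIR, LOG_DIR, MEMCACHE_STATE_DIR, MEMCACHE_BACKEND). As a minimal sketch of one thing I tried, they can be exported before rerunning — the paths and the backend value below are my own guesses for illustration, not documented defaults:

```shell
# Variable names are taken from the CRITICAL log lines above;
# the values are placeholders I chose, not documented defaults.
export DATA_DIR="$PWD/audit-data"
export LOG_DIR="$PWD/audit-logs"
export MEMCACHE_STATE_DIR="$PWD/memcache-state"
export MEMCACHE_BACKEND="sqlite"   # hypothetical backend name

# Create the directories the variables point at.
mkdir -p "$DATA_DIR" "$LOG_DIR" "$MEMCACHE_STATE_DIR"

# Then rerun the audit:
# ./scripts/audit/run_audit.sh ekse/libwasm-vulnerable
```

This silences the environment-variable warnings for me, but the `model_not_supported` error persists, so I don't think the two are related.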
Hi,
I saw "How to scan for vulnerabilities with GitHub Security Lab’s open source AI-powered framework" on the GitHub blog and tried following the instructions to run the tool in a Codespace, but I am getting "model not supported" errors in the output above. (I trimmed the GitHub Security Lab banner from the log, as it takes up a lot of space.)
I have a Copilot subscription. Is there something I need to configure for this to work?