NC Bench
Model: EVA Qwen 2.5 14B
Test: Language Comprehension
Scenario: Asking for directions (German)
Run ID: 2024-10-01-01J94QY3MPS77MX2WW1SZH64XE
Model input (User/Human)
I need your help. I'm in a new city and I need to find the nearest grocery store. I asked a local and they said: Die Straße runter, da ist eine große Kreuzung. Dann an der Littfasssäule links abbiegen. Rechts sollte diese alte Kirche sein, quer über den Markt. Der Supermarkt ist direkt dahinter. What is next to the grocery store? Return the result as your last paragraph, like this: LOCATION: [answer]
Model response (AI)
The grocery store is across the street from the old church, which is on the right side of the market. LOCATION: old church across from the grocery store
Meta
id: "gen-1727809916-g5lCXtmn0aKi3hWQrN6m"
total_cost: 0.00004425
upstream_id: "chatcmpl-d6651196-8b57-4864-aa75-1b12507ab2d6"
created_at: "2024-10-01T19:11:58.827621+00:00"
model: "eva-unit-01/eva-qwen-2.5-14b"
app_id: 182717
streamed: true
cancelled: false
provider_name: "Infermatic"
latency: 143
moderation_latency: (null)
generation_time: 1807
finish_reason: "stop"
tokens_prompt: 111
tokens_completion: 31
native_tokens_prompt: 113
native_tokens_completion: 32
native_tokens_reasoning: (null)
num_media_prompt: (null)
num_media_completion: (null)
origin: "https://ncbench.com/"
usage: 0.00004425
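
As a rough sanity check on the timing fields above, the sketch below derives output throughput from this run's record. It assumes latency and generation_time are reported in milliseconds (the record itself does not state units) and that generation_time covers the full completion; the RunTiming type and tokensPerSecond helper are illustrative, not part of NC Bench.

// Hypothetical throughput check for this run's timing fields.
// Assumption: latency and generation_time are milliseconds.
interface RunTiming {
  latency: number;                 // time to first token (ms, assumed)
  generation_time: number;         // total generation time (ms, assumed)
  native_tokens_completion: number;
}

function tokensPerSecond(run: RunTiming): number {
  return run.native_tokens_completion / (run.generation_time / 1000);
}

const run: RunTiming = {
  latency: 143,
  generation_time: 1807,
  native_tokens_completion: 32,
};

console.log(tokensPerSecond(run).toFixed(1)); // ~17.7 tokens/s under the millisecond assumption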
Evaluation details
Result: Pass
Evaluator: Matches Regex
Details: /^Location:(.+)church(.*)$/im
Meta Data: n/a
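
The Matches Regex evaluator records the pattern above; a minimal sketch of that pass/fail check follows, assuming the whole response text is tested with the case-insensitive and multiline flags shown, and that the model returned the LOCATION line as its own final paragraph, as the prompt requests. The checkRegex helper is illustrative and not NC Bench's actual harness.

// Illustrative re-implementation of the "Matches Regex" pass/fail check.
// The pattern is the one recorded above; everything around it is assumed.
const pattern = /^Location:(.+)church(.*)$/im;

// Response text from this run, with the LOCATION paragraph on its own line
// (an assumption about the raw output; the page shows it flattened).
const response =
  "The grocery store is across the street from the old church, which is on " +
  "the right side of the market.\n\n" +
  "LOCATION: old church across from the grocery store";

function checkRegex(text: string, re: RegExp): "Pass" | "Fail" {
  return re.test(text) ? "Pass" : "Fail";
}

console.log(checkRegex(response, pattern)); // "Pass"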