Model

EVA Qwen 2.5 14B

Test

Language Comprehension

Scenario

Asking for directions (German)

Run ID

2024-10-01-01J94R3V7KDVDKK60CGXD9E1Z4

Model input

User/Human
I need your help. I'm in a new city and I need to find the nearest grocery store. I asked a local and they said: Die Straße runter, da ist eine große Kreuzung. Dann an der Littfasssäule links abbiegen. Rechts sollte diese alte Kirche sein, quer über den Markt. Der Supermarkt ist direkt dahinter. What is next to the grocery store? Return the result as your last paragraph, like this: LOCATION: [answer]

[English translation of the German directions: Down the street there is a big intersection. Then turn left at the advertising column (Litfaßsäule). On the right there should be this old church, across the market square. The supermarket is right behind it.]

Model response

AI
LOCATION: An old church The grocery is on the other side of the Market The nice old lady showed me a good shortcut.
id"gen-1727810104-ImJlbwtcrA44nKhVuHTT" 
total_cost0.00004175
upstream_id"chatcmpl-b42b0de3-85da-44c5-aa3a-0fee1d0a3ef9" 
created_at"2024-10-01T19:15:07.838646+00:00" Tuesday, October 1st, 2024 at 7:15:07 PM GMT+00:00
model"eva-unit-01/eva-qwen-2.5-14b" 
app_id182717
streamedtrue
cancelledfalse
provider_name"Infermatic" 
latency580
moderation_latency(null)
generation_time2373
finish_reason"stop" 
tokens_prompt111
tokens_completion27
native_tokens_prompt113
native_tokens_completion27
native_tokens_reasoning(null)
num_media_prompt(null)
num_media_completion(null)
origin"https://ncbench.com/" 
usage0.00004175
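
For readability, the generation record above can be represented as a single typed object. The following TypeScript interface is only an illustrative sketch: the field names are copied from the record itself, while the types and nullability are inferred rather than taken from any published schema.

// Illustrative sketch of the generation-metadata record shown above.
// Field names come from the record; types/nullability are assumptions.
interface GenerationRecord {
  id: string;                              // e.g. "gen-1727810104-ImJlbwtcrA44nKhVuHTT"
  total_cost: number;
  upstream_id: string;
  created_at: string;                      // ISO 8601 timestamp
  model: string;                           // e.g. "eva-unit-01/eva-qwen-2.5-14b"
  app_id: number;
  streamed: boolean;
  cancelled: boolean;
  provider_name: string;
  latency: number;
  moderation_latency: number | null;
  generation_time: number;
  finish_reason: string;
  tokens_prompt: number;
  tokens_completion: number;
  native_tokens_prompt: number;
  native_tokens_completion: number;
  native_tokens_reasoning: number | null;
  num_media_prompt: number | null;
  num_media_completion: number | null;
  origin: string;
  usage: number;
}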

Evaluation details

Result: Pass
Evaluator: Matches Regex
Details: /^Location:(.+)church(.*)$/im
Meta Data: n/a
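
The evaluator is a plain regular-expression check against the model response: the run passes because the case-insensitive, multiline pattern finds a line beginning with "Location:" that also contains "church". A minimal TypeScript sketch of such a check follows; the function name and wiring are illustrative assumptions, not the benchmark's actual implementation.

// Sketch of the "Matches Regex" check as applied to this run.
// The pattern is the one listed under Details; matchesRegex is a hypothetical helper.
const pattern = /^Location:(.+)church(.*)$/im;

function matchesRegex(response: string, re: RegExp): boolean {
  return re.test(response);
}

const response =
  "LOCATION: An old church The grocery is on the other side of the Market " +
  "The nice old lady showed me a good shortcut.";

console.log(matchesRegex(response, pattern) ? "Pass" : "Fail"); // prints "Pass"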