NC Bench
Model: Claude 2.1
Test: Language Comprehension
Scenario: Asking for directions (German)
Run ID: 2024-08-09-01J4VYZT136WSEC3CW9R5EC8CB
Model input (User/Human)
I need your help. I'm in a new city and I need to find the nearest grocery store. I asked a local and they said: Die Straße runter, da ist eine große Kreuzung. Dann an der Littfasssäule links abbiegen. Rechts sollte diese alte Kirche sein, quer über den Markt. Der Supermarkt ist direkt dahinter. What is next to the grocery store? Return the result as your last paragraph, like this: LOCATION: [answer]
(English translation of the quoted directions: Down the street there is a big intersection. Then turn left at the advertising column. On the right there should be this old church, across the market square. The supermarket is directly behind it.)
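For context, a prompt like this can be sent to the recorded model slug through an OpenAI-compatible chat completions endpoint. The benchmark harness itself is not part of this record; the sketch below assumes an OpenRouter-style endpoint (suggested by the model id "anthropic/claude-2.1:beta" and the gen- prefixed generation id in the metadata), and the base URL and environment variable name are illustrative assumptions.

```python
import os
from openai import OpenAI

# Sketch only: assumes an OpenRouter-style OpenAI-compatible endpoint.
# The actual NC Bench harness is not shown in this record.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

prompt = (
    "I need your help. I'm in a new city and I need to find the nearest grocery store. "
    "I asked a local and they said: Die Straße runter, da ist eine große Kreuzung. "
    "Dann an der Littfasssäule links abbiegen. Rechts sollte diese alte Kirche sein, "
    "quer über den Markt. Der Supermarkt ist direkt dahinter. "
    "What is next to the grocery store? Return the result as your last paragraph, "
    "like this: LOCATION: [answer]"
)

response = client.chat.completions.create(
    model="anthropic/claude-2.1:beta",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```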
Model response (AI)
Based on the directions, here is my understanding of what is located next to the grocery store: Die Straße runter, da ist eine große Kreuzung. Dann an der Littfasssäule links abbiegen. Rechts sollte diese alte Kirche sein, quer über den Markt. Der Supermarkt ist direkt dahinter. LOCATION: eine alte Kirche
(English: the final answer "eine alte Kirche" means "an old church".)
Meta
id: "gen-0mPBm8ME5miW8NhzwphctVfyBf4m"
total_cost: 0.003232
upstream_id: "msg_017bd7gRp42V16MWkt1Lziyu"
created_at: "2024-08-09T16:19:17.906587+00:00" (Friday, August 9th, 2024 at 4:19:17 PM GMT+00:00)
model: "anthropic/claude-2.1:beta"
app_id: 182717
streamed: true
cancelled: false
provider_name: "Anthropic"
latency: 460
moderation_latency: (null)
generation_time: 1940
finish_reason: "end_turn"
tokens_prompt: 112
tokens_completion: 76
native_tokens_prompt: 128
native_tokens_completion: 92
num_media_prompt: (null)
num_media_completion: (null)
origin: "https://ncbench.com/"
usage: 0.003232
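The recorded total_cost is consistent with the native token counts if one assumes Claude 2.1's list pricing of $8 per million prompt tokens and $24 per million completion tokens; the rates are an assumption and not part of this record, but the arithmetic reproduces the figure exactly:

```python
# Reconstructing total_cost from the native token counts above.
# Rates are an assumption (Claude 2.1 list pricing), not part of the record.
PROMPT_RATE = 8 / 1_000_000       # USD per native prompt token (assumed)
COMPLETION_RATE = 24 / 1_000_000  # USD per native completion token (assumed)

cost = 128 * PROMPT_RATE + 92 * COMPLETION_RATE
print(round(cost, 6))  # 0.003232, matching total_cost
```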
Evaluation details
Result: Fail
Evaluator: Matches Regex
Details: /^Location:(.+)church(.*)$/im
Meta Data: n/a
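The Fail result follows directly from the recorded pattern: the evaluator looks for the English word "church" on the LOCATION line, but the model answered in German ("eine alte Kirche"). A minimal sketch of that check, re-applying the recorded regex in Python (the actual evaluator code is not shown; the JS-style /.../im flags map to re.IGNORECASE | re.MULTILINE):

```python
import re

# The evaluator's pattern, as recorded above.
pattern = re.compile(r"^Location:(.+)church(.*)$", re.IGNORECASE | re.MULTILINE)

# Final paragraph of the model response shown above.
answer = "LOCATION: eine alte Kirche"

match = pattern.search(answer)
print("Pass" if match else "Fail")  # Fail: the answer says "Kirche", never "church"
```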