IDA-VLM
Collection · 3 items · Updated
Error code: DatasetGenerationError
Exception: ArrowInvalid
Message: JSON parse error: Invalid value. in row 0
Traceback:
Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 160, in _generate_tables
    df = pandas_read_json(f)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 38, in pandas_read_json
    return pd.read_json(path_or_buf, **kwargs)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 791, in read_json
    json_reader = JsonReader(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 905, in __init__
    self.data = self._preprocess_data(data)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 917, in _preprocess_data
    data = data.read()
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/utils/file_utils.py", line 826, in read_with_retries
    out = read(*args, **kwargs)
  File "/usr/local/lib/python3.9/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1854, in _prepare_split_single
    for _, table in generator:
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 686, in wrapped
    for item in generator(*args, **kwargs):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 163, in _generate_tables
    raise e
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 137, in _generate_tables
    pa_table = paj.read_json(
  File "pyarrow/_json.pyx", line 308, in pyarrow._json.read_json
  File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: JSON parse error: Invalid value. in row 0

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1412, in compute_config_parquet_and_info_response
    parquet_operations, partial, estimated_dataset_info = stream_convert_to_parquet(
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 988, in stream_convert_to_parquet
    builder._prepare_split(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1741, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1897, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
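The first traceback shows the actual failure: one of the files handed to the JSON builder starts with byte 0xff, which can never open valid UTF-8 text. That byte typically belongs to a UTF-16 byte-order mark or to binary content (for example an image stored with a .json extension), so both pandas and pyarrow give up before any JSON is parsed. A minimal local check along the lines below can locate the offending file; the data directory and suffix pattern are placeholders, not names taken from this collection.

```python
import json
from pathlib import Path

# Placeholder for wherever the dataset's JSON/JSONL files live locally;
# the path is an assumption, not a name taken from this collection.
DATA_DIR = Path("data")


def check_file(path):
    """Return a short problem description, or None if the file parses cleanly."""
    head = path.read_bytes()[:2]
    # A leading 0xff or 0xfe byte is never valid UTF-8; it usually means a
    # UTF-16 BOM or binary content, matching the UnicodeDecodeError above.
    if head[:1] in (b"\xff", b"\xfe"):
        return f"starts with {head!r}, not UTF-8 text"
    try:
        text = path.read_text(encoding="utf-8")
    except UnicodeDecodeError as err:
        return f"not valid UTF-8: {err}"
    try:
        json.loads(text)  # a single JSON document (e.g. one big array)
        return None
    except json.JSONDecodeError:
        pass
    try:
        for line in text.splitlines():
            if line.strip():
                json.loads(line)  # JSON Lines: one object per line
        return None
    except json.JSONDecodeError as err:
        return f"neither JSON nor JSON Lines: {err}"


if __name__ == "__main__":
    for p in sorted(DATA_DIR.rglob("*.json*")):
        problem = check_file(p)
        if problem:
            print(f"{p}: {problem}")
```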
id (int64) | conversations (list) | type (string) |
---|---|---|
0 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000511522_487711_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000511522.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(126,308),(294,944)</box>"
}
] | ground_ref |
1 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000322698_0_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000322698.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(261,388),(625,989)</box>"
}
] | ground_ref |
2 | [
{
"from": "user",
"value": "Aaron is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Relic/HaG2Mm5K6TE@5_0.jpg</img>\nPaul is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Relic/HaG2Mm5K6TE@5_1.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Relic/[email protected]</img>\nWhy is Paul crying?"
},
{
"from": "assistant",
"value": "Paul received bad news from Aaron."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Paul has her hand to her face like she is crying and has her back to Aaron."
}
] | vqa_vcr |
3 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/coco/train2017/000000376371.jpg</img>\n\nManchego and Roquefort are cheeses got from which animal's milk?\nA. deer\nB. cow\nC. sheep\nD. goat\nAnswer with the option's letter from the given choices directly."
},
{
"from": "assistant",
"value": "C"
}
] | llava |
4 | [
{
"from": "user",
"value": "Given some people characters, Scott is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/83415/Scott.jpg</img>\n. Please explain what's happening in the photo <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/49402821.jpg</img>\n."
},
{
"from": "assistant",
"value": "Scott in a salmon skirt attempting to hit a soccer ball but missing ."
}
] | caption_f30k |
5 | [
{
"from": "user",
"value": "Matthew is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0041_The_Sixth_Sense/0041_The_Sixth_Sense_01.26.21.093-01.26.23.814@0_1.jpg</img>\nJennifer is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0041_The_Sixth_Sense/0041_The_Sixth_Sense_01.26.21.093-01.26.23.814@0_2.jpg</img>\nDiane is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0041_The_Sixth_Sense/0041_The_Sixth_Sense_01.26.21.093-01.26.23.814@0_7.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0041_The_Sixth_Sense/[email protected]</img>\nWhy would Matthew and Diane be mad at this person?"
},
{
"from": "assistant",
"value": "They feel what they are doing is inappropriate."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "The interruption has saddened the mourner Jennifer and Matthew and Diane likely judge this person for it."
}
] | vqa_vcr |
6 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000006051_200817_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000006051.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(1,2),(628,775)</box>"
}
] | ground_ref |
7 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Harts_War/[email protected]</img>\nthere are some object or person images. Charles: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Harts_War/qA73y2w3WUc@10_0.jpg</img>\nBrandon: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Harts_War/qA73y2w3WUc@10_2.jpg</img>\nWhy does Brandon look so serious?"
},
{
"from": "assistant",
"value": "Brandon has just read Charless sentence in a court case and it is not good."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Brandon is holding a piece of paper looking at it and looking serious, Charles has a stunned look on his face and is clenching his hands together on the table, there is a gavel on the table,there is a crowd behind Charles with hard looks on their faces."
}
] | vqa_vcr |
8 | [
{
"from": "user",
"value": "Given some people characters, Thomas is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/146981/Thomas.jpg</img>\n. Please provide an overview of the visual information in this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/4672056076.jpg</img>\n."
},
{
"from": "assistant",
"value": "Thomas in a gray hat and tank top with black shorts doing a handstand ."
}
] | caption_f30k |
9 | [
{
"from": "user",
"value": "Given some people characters, Jack is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/36689/Jack.jpg</img>\n, Aaron is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/36689/Aaron.jpg</img>\n, Ashley is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/36689/Ashley.jpg</img>\n, Carolyn is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/36689/Carolyn.jpg</img>\n. Tell me about the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/957585444.jpg</img>\n."
},
{
"from": "assistant",
"value": "Jack, Aaron, Ashley and Carolyn are looking at a picture ."
}
] | caption_f30k |
10 | [
{
"from": "user",
"value": "Aaron is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Independence_Day/NyOTaHRBTXc@9_2.jpg</img>\nGregory is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Independence_Day/NyOTaHRBTXc@9_5.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Independence_Day/[email protected]</img>\nWhat are Aaron and Gregory employed as?"
},
{
"from": "assistant",
"value": "They are employed as surveillance workers"
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Aaron and Gregory are dressed in officer's uniform and are tracking information on a radar system"
}
] | vqa_vcr |
11 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3061_SNOW_FLOWER/[email protected]</img>\nthere are some object or person images. Amanda: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3061_SNOW_FLOWER/3061_SNOW_FLOWER_00.07.02.273-00.07.05.113@0_0.jpg</img>\nJerry: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3061_SNOW_FLOWER/3061_SNOW_FLOWER_00.07.02.273-00.07.05.113@0_1.jpg</img>\nWhere are Amanda and Jerry?"
},
{
"from": "assistant",
"value": "They are in a living room."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "They seem to be in the living room of their home because they both seem to feel comfortable to be there."
}
] | vqa_vcr |
12 | [
{
"from": "user",
"value": ". Examine the image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/7150152099.jpg</img>\n and describe its content."
},
{
"from": "assistant",
"value": "A high jumper in a mostly-empty arena begins to lift herself over the bar using her pole ."
}
] | caption_f30k |
13 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000207543_1962065_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000207543.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(587,772),(1000,1000)</box>"
}
] | ground_ref |
14 | [
{
"from": "user",
"value": "Given some people characters, Emily is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/133960/Emily.jpg</img>\n. Tell me about the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/248151015.jpg</img>\n."
},
{
"from": "assistant",
"value": "Emily wearing jeans , a tank-top , and a work glove is bent overlooking at a wheelbarrow ."
}
] | caption_f30k |
15 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000297227_5_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000297227.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(725,89),(903,631)</box>"
}
] | ground_ref |
16 | [
{
"from": "user",
"value": "Larry is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1057_Seven_pounds/1057_Seven_pounds_01.56.31.623-01.56.38.169@0_1.jpg</img>\nStephen is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1057_Seven_pounds/1057_Seven_pounds_01.56.31.623-01.56.38.169@0_5.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1057_Seven_pounds/[email protected]</img>\nWhy is Larry smiling?"
},
{
"from": "assistant",
"value": "Larry is enjoying the company of Stephen."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Stephen is a female and possibly a love interest. Larry is looking right at her."
}
] | vqa_vcr |
17 | [
{
"from": "user",
"value": "Given some people characters, Paul is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/11355/Paul.jpg</img>\n. What do you see in this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3808149060.jpg</img>\n?"
},
{
"from": "assistant",
"value": "A group standing around in the background while Paul stretches with her arms up in the air in the doorway ."
}
] | caption_f30k |
18 | [
{
"from": "user",
"value": "Samuel is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3053_PARENTAL_GUIDANCE/3053_PARENTAL_GUIDANCE_01.01.44.728-01.01.55.426@4_1.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3053_PARENTAL_GUIDANCE/[email protected]</img>\nWhat sport is Samuel playing?"
},
{
"from": "assistant",
"value": "He is skateboarding on a ramp."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "He is riding a skateboard."
}
] | vqa_vcr |
19 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/coco/train2017/000000361356.jpg</img>\n\nWhat is the name of the person making making this thread fruit designs?\nA. doctor\nB. none\nC. designer\nD. weaver\nAnswer with the option's letter from the given choices directly."
},
{
"from": "assistant",
"value": "D"
}
] | llava |
20 | [
{
"from": "user",
"value": "Given some people characters, Jerry is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/123526/Jerry.jpg</img>\n. For the image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3264397357.jpg</img>\n, give a description."
},
{
"from": "assistant",
"value": "Jerry wearing a red scarf is standing on a hill by a winding road ."
}
] | caption_f30k |
21 | [
{
"from": "user",
"value": "Tyler is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3005_ABRAHAM_LINCOLN_VAMPIRE_HUNTER/3005_ABRAHAM_LINCOLN_VAMPIRE_HUNTER_00.51.06.716-00.51.09.787@0_3.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3005_ABRAHAM_LINCOLN_VAMPIRE_HUNTER/3005_ABRAHAM_LINCOLN_VAMPIRE_HUNTER_00.51.06.716-00.51.09.787@0.jpg</img>\nDid Tyler just get out of the casket?"
},
{
"from": "assistant",
"value": "Yes, Tyler did get out of the casket."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "The casket is slightly open signifying that something went in or out and Tyler is the closest one to the casket."
}
] | vqa_vcr |
22 | [
{
"from": "user",
"value": "Given some people characters, Larry is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/40213/Larry.jpg</img>\n, Virginia is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/40213/Virginia.jpg</img>\n. Examine the image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3314788599.jpg</img>\n and describe its content."
},
{
"from": "assistant",
"value": "Larry and Virginia in blue jeans sit by a riverbank ."
}
] | caption_f30k |
23 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000375099_0_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000375099.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(3,310),(313,789)</box>"
}
] | ground_ref |
24 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000096897_1070942_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000096897.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(360,564),(878,882)</box>"
}
] | ground_ref |
25 | [
{
"from": "user",
"value": "Given some people characters, George is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/16951/George.jpg</img>\n. What is displayed in this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/6366234987.jpg</img>\n?"
},
{
"from": "assistant",
"value": "George on a ice rink in hockey attire wearing a # 39 \" Huppe \" jersey ."
}
] | caption_f30k |
26 | [
{
"from": "user",
"value": "Given some people characters, Amanda is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/134752/Amanda.jpg</img>\n. Examine the image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/7502965482.jpg</img>\n and describe its content."
},
{
"from": "assistant",
"value": "Amanda marches at a rally to thank U.S. military while holding a flag ."
}
] | caption_f30k |
27 | [
{
"from": "user",
"value": "Emily is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Her_Minor_Thing/GC_ehA6dd_I@12_3.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Her_Minor_Thing/[email protected]</img>\nWhat has distracted Emily from what he was reading?"
},
{
"from": "assistant",
"value": "Emily has been distracted by something funny on TV."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Everyone in the room is looking in the same direction as Emily and a lot of them are laughing so something funny probably happened on the TV."
}
] | vqa_vcr |
28 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000205756_189137_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000205756.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(197,264),(527,909)</box>"
}
] | ground_ref |
29 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/coco/train2017/000000537673.jpg</img>\n\nWhat might pose a risk for the person on the skateboard?"
},
{
"from": "assistant",
"value": "The person on the skateboard is jumping off a ledge in a park environment, which can pose risks such as falling or losing balance during the stunt. The uneven surface at the edge of the concrete, combined with the speed and force applied while performing the trick, could potentially result in injuries if the skateboarder misjudges their landing or does not maintain proper control over the board. Additionally, the presence of other people and obstacles in the area, such as pedestrians or bicycles, could raise the risk of collision and accidents if the skateboarder is not careful or attentive to their surroundings. Overall, the skateboarder needs to exercise caution, practice their skills, and be mindful of their environment to minimize the chances of injury or accidents."
}
] | llava |
30 | [
{
"from": "user",
"value": "Brandon is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Reader/6vjDZQtAKuc@16_1.jpg</img>\nRuth is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Reader/6vjDZQtAKuc@16_3.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Reader/[email protected]</img>\nWhat is Brandon doing?"
},
{
"from": "assistant",
"value": "He is reading to Ruth."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "He is holding a book and looking at her with his mouth open."
}
] | vqa_vcr |
31 | [
{
"from": "user",
"value": "Given some people characters, Jeffrey is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/61651/Jeffrey.jpg</img>\n, Nicole is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/61651/Nicole.jpg</img>\n. Please describe the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3920626767.jpg</img>\n."
},
{
"from": "assistant",
"value": "Jeffrey in a red suit is riding a white horse with a white and red ribbon tied into Nicole ."
}
] | caption_f30k |
32 | [
{
"from": "user",
"value": "Given some people characters, Sharon is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/103877/Sharon.jpg</img>\n. Analyze the contents of the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/4243530880.jpg</img>\n."
},
{
"from": "assistant",
"value": "Sharon with a black sombrero is riding a bull inside an arena ."
}
] | caption_f30k |
33 | [
{
"from": "user",
"value": "What is the cat doing in the image?\n<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/coco/train2017/000000484173.jpg</img>\n"
},
{
"from": "assistant",
"value": "In the image, the cat is lying inside a mesh pocket of an open suitcase."
},
{
"from": "user",
"value": "Where is the cat located in relation to the suitcase?"
},
{
"from": "assistant",
"value": "The cat is located inside the side pocket or compartment of the suitcase, under the mesh lining."
},
{
"from": "user",
"value": "Is the cat on top or inside the luggage?"
},
{
"from": "assistant",
"value": "The cat is inside the luggage, specifically in the side pocket or compartment of the suitcase."
},
{
"from": "user",
"value": "What kind of luggage is the cat laying in?"
},
{
"from": "assistant",
"value": "The cat is laying in a suitcase, in a mesh pocket of the open suitcase."
}
] | llava |
34 | [
{
"from": "user",
"value": "Given some people characters, Barbara is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/56776/Barbara.jpg</img>\n. Please explain what's happening in the photo <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/372243070.jpg</img>\n."
},
{
"from": "assistant",
"value": "Barbara with a tattoo wearing a sleeveless tailcoat is balancing a unicycle on his chin in front of a building ."
}
] | caption_f30k |
35 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0001_American_Beauty/[email protected]</img>\nthere are some object or person images. Ashley: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0001_American_Beauty/0001_American_Beauty_00.31.34.890-00.31.37.039@0_0.jpg</img>\nEmily: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0001_American_Beauty/0001_American_Beauty_00.31.34.890-00.31.37.039@0_1.jpg</img>\nHow does person Emily feel?"
},
{
"from": "assistant",
"value": "Person Emily feels joyful."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Person Emily enjoyed the kiss she just shared with person Ashley and is feeling very happy."
}
] | vqa_vcr |
36 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000509023_1_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000509023.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(102,144),(392,673)</box>"
}
] | ground_ref |
37 | [
{
"from": "user",
"value": "What can one infer about the weather conditions during the flight?\n<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/coco/train2017/000000563458.jpg</img>\n"
},
{
"from": "assistant",
"value": "From the image, one can infer that the weather conditions for the airplane in flight are ideal. The airplane is flying through a clear, bright blue sky with no clouds visible. Such conditions are usually favorable for a smooth and safe flight, as they indicate the absence of storms, turbulence, or other adverse weather phenomena that could negatively affect air travel."
}
] | llava |
38 | [
{
"from": "user",
"value": "Ruth is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Throwdown/TTOlRTmEdoY@32_0.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Throwdown/[email protected]</img>\nWhat is Ruth feeling?"
},
{
"from": "assistant",
"value": "Ruth is feeling nervous and apprehensive"
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "He is looking slightly downwards and his eyebrows are drawn slightly"
}
] | vqa_vcr |
39 | [
{
"from": "user",
"value": "Given some people characters, Dorothy is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/122482/Dorothy.jpg</img>\n, Frank is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/122482/Frank.jpg</img>\n, Amanda is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/122482/Amanda.jpg</img>\n, Victoria is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/122482/Victoria.jpg</img>\n. Tell me about the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3092756650.jpg</img>\n."
},
{
"from": "assistant",
"value": "Dorothy, Frank, Amanda and Victoria from another country look at the camera ."
}
] | caption_f30k |
40 | [
{
"from": "user",
"value": "Betty is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1059_The_devil_wears_prada/1059_The_devil_wears_prada_00.04.42.000-00.04.51.458@3_1.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1059_The_devil_wears_prada/[email protected]</img>\nIs Betty currently leaving?"
},
{
"from": "assistant",
"value": "Yes, she is leaving."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "She is standing near her desk and is picking up some of the objects from it."
}
] | vqa_vcr |
41 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000198969_1135147_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000198969.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(405,308),(895,674)</box>"
}
] | ground_ref |
42 | [
{
"from": "user",
"value": "Given some people characters, Raymond is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/26037/Raymond.jpg</img>\n. What is displayed in this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/4564192674.jpg</img>\n?"
},
{
"from": "assistant",
"value": "Raymond in a red hat , tie-dye purple shirt and black pants stands on a walkway ."
}
] | caption_f30k |
43 | [
{
"from": "user",
"value": "Given some people characters, Elizabeth is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/64866/Elizabeth.jpg</img>\n. What is displayed in this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/8088990803.jpg</img>\n?"
},
{
"from": "assistant",
"value": "Elizabeth sitting in a patch of grass with her blond doll next to her ."
}
] | caption_f30k |
44 | [
{
"from": "user",
"value": "Given some people characters, Jerry is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/56287/Jerry.jpg</img>\n. Please describe the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/4787655421.jpg</img>\n."
},
{
"from": "assistant",
"value": "Jerry in thin-metal armor , riding a horse and carrying a yellow and red shield ."
}
] | caption_f30k |
45 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Almost_Famous/[email protected]</img>\nthere are some object or person images. Christine: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Almost_Famous/fgMlyvq2oNo@8_0.jpg</img>\nDonald: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Almost_Famous/fgMlyvq2oNo@8_1.jpg</img>\nWhy does Christine look at Donald like that?"
},
{
"from": "assistant",
"value": "She is disappointed at what Donald did."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "Christine's face looks like it's filled with disappointment for Donald, and is giving her a lecture."
}
] | vqa_vcr |
46 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000017651_2_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000017651.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(46,399),(448,722)</box>"
}
] | ground_ref |
47 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/ocr_vqa/images/226311163.jpg</img>\n\nWho wrote this book?\nAnswer the question using a single word or phrase."
},
{
"from": "assistant",
"value": "Jane I. Guyer"
},
{
"from": "user",
"value": "What is the title of this book?"
},
{
"from": "assistant",
"value": "Marginal Gains: Monetary Transactions in Atlantic Africa (Lewis Henry Morgan Lecture Series)"
},
{
"from": "user",
"value": "What is the genre of this book?"
},
{
"from": "assistant",
"value": "Business & Money"
},
{
"from": "user",
"value": "Is this a financial book?"
},
{
"from": "assistant",
"value": "Yes"
}
] | llava |
48 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/coco/train2017/000000026720.jpg</img>\n\nIs this a regular size pizza?\nAnswer the question using a single word or phrase."
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "Is this a meat lovers pizza?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "What kind of pizza is this?"
},
{
"from": "assistant",
"value": "Veggie"
},
{
"from": "user",
"value": "Does this pizza have pineapple on it?"
},
{
"from": "assistant",
"value": "No"
}
] | llava |
49 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000138201_433421_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000138201.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(291,262),(547,635)</box>"
}
] | ground_ref |
50 | [
{
"from": "user",
"value": "Given some people characters, Patrick is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/144690/Patrick.jpg</img>\n, Ashley is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/144690/Ashley.jpg</img>\n. Please explain what's happening in the photo <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/2631799914.jpg</img>\n."
},
{
"from": "assistant",
"value": "Patrick are in a white room with many chairs watching Ashley and red and Ashley in blue play video games on a Wii ."
}
] | caption_f30k |
51 | [
{
"from": "user",
"value": "Given some people characters, Justin is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/10673/Justin.jpg</img>\n, Nancy is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/10673/Nancy.jpg</img>\n, Anna is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/10673/Anna.jpg</img>\n, Nicole is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/10673/Nicole.jpg</img>\n, Stephen is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/10673/Stephen.jpg</img>\n. Tell me about the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/4886389665.jpg</img>\n."
},
{
"from": "assistant",
"value": "Justin, Nancy, Anna, Nicole and Stephen talk outside of a store ."
}
] | caption_f30k |
52 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000276949_441999_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000276949.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(471,263),(713,877)</box>"
}
] | ground_ref |
53 | [
{
"from": "user",
"value": "Patricia is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1001_Flight/1001_Flight_00.02.39.099-00.02.44.687@1_0.jpg</img>\nimage 1 is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1001_Flight/1001_Flight_00.02.39.099-00.02.44.687@1_4.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1001_Flight/[email protected]</img>\nWhy is Patricia in her underwear?"
},
{
"from": "assistant",
"value": "She is getting dressed."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "She is just getting out of image 1 and is getting dressed to leave."
}
] | vqa_vcr |
54 | [
{
"from": "user",
"value": ". Please give a summary of the image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/2243904502.jpg</img>\n."
},
{
"from": "assistant",
"value": "A yellow dog running through a yard covered in leaves while holding a yellow toy in his mouth ."
}
] | caption_f30k |
55 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3040_JULIE_AND_JULIA/[email protected]</img>\nthere are some object or person images. Ruth: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3040_JULIE_AND_JULIA/3040_JULIE_AND_JULIA_00.59.53.576-01.00.04.749@1_0.jpg</img>\nWhy is Ruth looking into the mirror?"
},
{
"from": "assistant",
"value": "They want to see how they look."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "They appear to be getting ready to leave and are curious about their appearance."
}
] | vqa_vcr |
56 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000214373_2158438_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000214373.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(519,174),(773,1000)</box>"
}
] | ground_ref |
57 | [
{
"from": "user",
"value": "Frank is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3030_GROWN_UPS/3030_GROWN_UPS_00.18.40.276-00.18.41.799@0_7.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3030_GROWN_UPS/[email protected]</img>\nWhy is Frank smiling?"
},
{
"from": "assistant",
"value": "something unexpected has happened but he wanted it to."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "only two other people are smiling the rest look a little confused. so they were not expecting what has just happened."
}
] | vqa_vcr |
58 | [
{
"from": "user",
"value": "Given some people characters, Donald is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/82745/Donald.jpg</img>\n. Tell me about the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/497374910.jpg</img>\n."
},
{
"from": "assistant",
"value": "Donald is sitting in the library and sleeping with her head in a very big book ."
}
] | caption_f30k |
59 | [
{
"from": "user",
"value": "Given some people characters, Thomas is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/114895/Thomas.jpg</img>\n, Kimberly is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/114895/Kimberly.jpg</img>\n. What is displayed in this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/2594640580.jpg</img>\n?"
},
{
"from": "assistant",
"value": "Thomas standing near Kimberly sitting on a swing on a playground ."
}
] | caption_f30k |
60 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000422064_442108_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000422064.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(606,302),(773,912)</box>"
}
] | ground_ref |
61 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Madonna_Truth_or_Dare/[email protected]</img>\nthere are some object or person images. Elizabeth: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Madonna_Truth_or_Dare/scDWdIV7Fes@1_10.jpg</img>\nLaura: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Madonna_Truth_or_Dare/scDWdIV7Fes@1_12.jpg</img>\nWhy do Laura and Elizabeth have their hands up?"
},
{
"from": "assistant",
"value": "They are dancing for the crowd."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Laura and Elizabeth appear to be dancing or about to dance, hence why the crowd is gathering around them. They don't appear to be talking or fighting so the other likely possibility which would draw such a crowd is dancing."
}
] | vqa_vcr |
62 | [
{
"from": "user",
"value": "Ronald is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1060_Yes_man/1060_Yes_man_01.17.35.350-01.17.43.006@3_1.jpg</img>\nimage 1 is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1060_Yes_man/1060_Yes_man_01.17.35.350-01.17.43.006@3_5.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1060_Yes_man/[email protected]</img>\nDoes Ronald want image 1?"
},
{
"from": "assistant",
"value": "No she does not."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "She does not look thrilled to see it."
}
] | vqa_vcr |
63 | [
{
"from": "user",
"value": ". For the image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/2364096157.jpg</img>\n, give a description."
},
{
"from": "assistant",
"value": "A dog runs on the beach with a red dog toy with a yellow rope attached ."
}
] | caption_f30k |
64 | [
{
"from": "user",
"value": "Mark is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Reader/ESkmmFHikgg@34_0.jpg</img>\nMegan is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Reader/ESkmmFHikgg@34_1.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_The_Reader/[email protected]</img>\nIs Megan paying attention to Mark?"
},
{
"from": "assistant",
"value": "Yes, Megan is paying attention to Mark."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "Megan is looking directly at Mark who seems to be talking to Megan."
}
] | vqa_vcr |
65 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000237286_1_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000237286.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(419,240),(902,584)</box>"
}
] | ground_ref |
66 | [
{
"from": "user",
"value": "Given some people characters, Ryan is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/76445/Ryan.jpg</img>\n, Brian is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/76445/Brian.jpg</img>\n. Explain the content of this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/327520898.jpg</img>\n."
},
{
"from": "assistant",
"value": "Ryan and Brian bagging merchandise in bright orange plastic bags ."
}
] | caption_f30k |
67 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1052_Harry_Potter_and_the_order_of_phoenix/1052_Harry_Potter_and_the_order_of_phoenix_00.57.40.000-00.57.46.094@1.jpg</img>\nthere are some object or person images. Mark: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1052_Harry_Potter_and_the_order_of_phoenix/1052_Harry_Potter_and_the_order_of_phoenix_00.57.40.000-00.57.46.094@1_0.jpg</img>\nHow is Mark feeling?"
},
{
"from": "assistant",
"value": "Mark is feeling anxious."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "Mark is looking back as they are walking away."
}
] | vqa_vcr |
68 | [
{
"from": "user",
"value": "Sarah is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0011_Gandhi/0011_Gandhi_00.21.22.262-00.21.22.557@0_0.jpg</img>\nHelen is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0011_Gandhi/0011_Gandhi_00.21.22.262-00.21.22.557@0_1.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0011_Gandhi/[email protected]</img>\nWhy are Sarah and Helen meeting?"
},
{
"from": "assistant",
"value": "There is a diplomatic issue that needs attention."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "Usually when diplomats meet, it is a sign of goodwill from each country - and a way to mediate and compromise a solution peacefully."
}
] | vqa_vcr |
69 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000447724_22823_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000447724.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(300,98),(490,496)</box>"
}
] | ground_ref |
70 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1057_Seven_pounds/[email protected]</img>\nthere are some object or person images. Melissa: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1057_Seven_pounds/1057_Seven_pounds_00.32.14.833-00.32.17.244@0_0.jpg</img>\nwhat is Melissa going to do next?"
},
{
"from": "assistant",
"value": "Melissa is going to open her eyes and smile."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "she is trying to make him laugh"
}
] | vqa_vcr |
71 | [
{
"from": "user",
"value": "Virginia is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0023_THE_BUTTERFLY_EFFECT/0023_THE_BUTTERFLY_EFFECT_01.37.29.412-01.37.36.547@2_5.jpg</img>\nimage 1 is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0023_THE_BUTTERFLY_EFFECT/0023_THE_BUTTERFLY_EFFECT_01.37.29.412-01.37.36.547@2_9.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_0023_THE_BUTTERFLY_EFFECT/[email protected]</img>\nWhat is Virginia doing?"
},
{
"from": "assistant",
"value": "She is sitting on image 1."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "She isn't turning to walk away and is in the middle of sitting down."
}
] | vqa_vcr |
72 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/gqa/images/2362774.jpg</img>\n\nDoes the truck to the right of the woman look blue and large?\nAnswer the question using a single word or phrase."
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "The truck that is crossing the street is of what color?"
},
{
"from": "assistant",
"value": "Blue"
},
{
"from": "user",
"value": "Who is wearing trousers?"
},
{
"from": "assistant",
"value": "Woman"
},
{
"from": "user",
"value": "Do you see any cars to the left of the truck that is crossing the street?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "What kind of vehicle is to the left of the woman that is wearing a shirt?"
},
{
"from": "assistant",
"value": "Car"
},
{
"from": "user",
"value": "Are there either any women or people that are walking?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "What is the color of the sign above the sign?"
},
{
"from": "assistant",
"value": "Red"
},
{
"from": "user",
"value": "Which kind of vehicle is on the street?"
},
{
"from": "assistant",
"value": "Truck"
},
{
"from": "user",
"value": "What is the vehicle on the street the woman is crossing?"
},
{
"from": "assistant",
"value": "Truck"
},
{
"from": "user",
"value": "Who is wearing a shirt?"
},
{
"from": "assistant",
"value": "Woman"
},
{
"from": "user",
"value": "Does the truck to the left of the other truck look large and red?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "How big is the blue truck?"
},
{
"from": "assistant",
"value": "Large"
},
{
"from": "user",
"value": "What is the vehicle on the street that the woman is crossing?"
},
{
"from": "assistant",
"value": "Truck"
},
{
"from": "user",
"value": "What vehicle is on the street?"
},
{
"from": "assistant",
"value": "Truck"
},
{
"from": "user",
"value": "Is the blue truck to the right or to the left of the woman that is wearing trousers?"
},
{
"from": "assistant",
"value": "Right"
},
{
"from": "user",
"value": "What is the name of the sign above the sign that looks yellow?"
},
{
"from": "assistant",
"value": "Stop sign"
},
{
"from": "user",
"value": "Is the woman to the left of the truck wearing jeans?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "Is the car on the right side or on the left?"
},
{
"from": "assistant",
"value": "Left"
},
{
"from": "user",
"value": "What vehicle is crossing the street?"
},
{
"from": "assistant",
"value": "Truck"
},
{
"from": "user",
"value": "Is the car on the left?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "Are there women or catchers in this picture?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "Is there any blue truck or train?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "Does the truck to the left of the other truck have white color?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "Is the truck to the left of the other truck small or large?"
},
{
"from": "assistant",
"value": "Large"
}
] | llava |
73 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_One_Direction_The_Inside_Story/[email protected]</img>\nthere are some object or person images. Amanda: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_One_Direction_The_Inside_Story/9ZmqIpTPFmM@5_0.jpg</img>\nWhat is Amanda doing?"
},
{
"from": "assistant",
"value": "Amanda is getting ready to walk to an event."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Amanda is walking down a red carpet to some kind of event."
}
] | vqa_vcr |
74 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3019_COLOMBIANA/[email protected]</img>\nthere are some object or person images. Mark: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3019_COLOMBIANA/3019_COLOMBIANA_01.34.02.046-01.34.08.727@1_0.jpg</img>\nDiane: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3019_COLOMBIANA/3019_COLOMBIANA_01.34.02.046-01.34.08.727@1_2.jpg</img>\nWhy is Diane hitting Mark?"
},
{
"from": "assistant",
"value": "She is trying to get away from him."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "He looks skeevy and dangerous - he was probably trying to attack or kidnap her."
}
] | vqa_vcr |
75 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1041_This_is_40/[email protected]</img>\nthere are some object or person images. Nicholas: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1041_This_is_40/1041_This_is_40_01.41.13.119-01.41.14.546@0_0.jpg</img>\nSusan: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_1041_This_is_40/1041_This_is_40_01.41.13.119-01.41.14.546@0_2.jpg</img>\nis person Susan ignoring person Nicholas?"
},
{
"from": "assistant",
"value": "person Susan is ignoring person Nicholas"
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "person Susan is looking away as to not make eye contact."
}
] | vqa_vcr |
76 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000037748_0_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000037748.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(352,365),(958,650)</box>"
}
] | ground_ref |
77 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000172376_584330_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000172376.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(0,359),(146,956)</box>"
}
] | ground_ref |
78 | [
{
"from": "user",
"value": "Given some people characters, Amber is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/11937/Amber.jpg</img>\n. Please describe the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/370197474.jpg</img>\n."
},
{
"from": "assistant",
"value": "Amber wearing a black jacket and blue jeans sculpts a bird out of a carrot ."
}
] | caption_f30k |
79 | [
{
"from": "user",
"value": "Given some people characters, Nancy is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/70578/Nancy.jpg</img>\n. Please describe the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/2796930150.jpg</img>\n."
},
{
"from": "assistant",
"value": "Nancy sits in a chair with a fishing pole next to water ."
}
] | caption_f30k |
80 | [
{
"from": "user",
"value": "Given some people characters, Sharon is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/106381/Sharon.jpg</img>\n, Victoria is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/106381/Victoria.jpg</img>\n, Susan is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/106381/Susan.jpg</img>\n. Analyze the contents of the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/2428088070.jpg</img>\n."
},
{
"from": "assistant",
"value": "Sharon in a red shirt and Victoria and Susan sit around a table eating and drinking ."
}
] | caption_f30k |
81 | [
{
"from": "user",
"value": "David is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Invictus/3m1JShNLCYA@0_0.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Invictus/[email protected]</img>\nWhat would happen if David leaned over in this position the whole time he was having his tea?"
},
{
"from": "assistant",
"value": "David's back would be sore."
},
{
"from": "user",
"value": "give a reason for the answer"
},
{
"from": "assistant",
"value": "David is a very old man, and his back is probably very sensitive to discomfort."
}
] | vqa_vcr |
82 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/gqa/images/2408130.jpg</img>\n\nWhat is the appliance in the cabinets made out of wood?\nAnswer the question using a single word or phrase."
},
{
"from": "assistant",
"value": "Dishwasher"
},
{
"from": "user",
"value": "What's the dishwasher in?"
},
{
"from": "assistant",
"value": "Cabinets"
},
{
"from": "user",
"value": "What is the dishwasher in?"
},
{
"from": "assistant",
"value": "Cabinets"
},
{
"from": "user",
"value": "Do you see any drawers under the kitchen sink?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "Which kind of furniture is under the sink?"
},
{
"from": "assistant",
"value": "Cabinet"
},
{
"from": "user",
"value": "Which kind of furniture is made of wood?"
},
{
"from": "assistant",
"value": "Cabinets"
},
{
"from": "user",
"value": "Which kind of furniture is wooden?"
},
{
"from": "assistant",
"value": "Cabinets"
},
{
"from": "user",
"value": "Which kind of furniture is brown?"
},
{
"from": "assistant",
"value": "Cabinets"
},
{
"from": "user",
"value": "What is the white appliance in the photo?"
},
{
"from": "assistant",
"value": "Dishwasher"
},
{
"from": "user",
"value": "What appliance is it?"
},
{
"from": "assistant",
"value": "Dishwasher"
},
{
"from": "user",
"value": "What appliance is presented in the photograph?"
},
{
"from": "assistant",
"value": "Dishwasher"
},
{
"from": "user",
"value": "What kind of appliance is white?"
},
{
"from": "assistant",
"value": "Dishwasher"
},
{
"from": "user",
"value": "Is there a nightstand near the appliance on the left?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "On which side is the pot?"
},
{
"from": "assistant",
"value": "Right"
},
{
"from": "user",
"value": "Are there any pots on the new counter in the kitchen?"
},
{
"from": "assistant",
"value": "Yes"
},
{
"from": "user",
"value": "Which side of the image is the black device on?"
},
{
"from": "assistant",
"value": "Left"
},
{
"from": "user",
"value": "What device is sitting on the counter?"
},
{
"from": "assistant",
"value": "Phone"
},
{
"from": "user",
"value": "What is the item of furniture under the steel sink?"
},
{
"from": "assistant",
"value": "Cabinet"
},
{
"from": "user",
"value": "Does the kitchen counter look white and new?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "Is that counter old?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "What's the plant in?"
},
{
"from": "assistant",
"value": "Pot"
},
{
"from": "user",
"value": "Are there white phones in the photo?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "Are there either any towels or stoves in this photograph?"
},
{
"from": "assistant",
"value": "No"
},
{
"from": "user",
"value": "On which side of the photo is the dishwasher?"
},
{
"from": "assistant",
"value": "Left"
},
{
"from": "user",
"value": "Which color does the pot on the counter have?"
},
{
"from": "assistant",
"value": "Gray"
},
{
"from": "user",
"value": "What do you think is the device that looks black sitting on?"
},
{
"from": "assistant",
"value": "Counter"
},
{
"from": "user",
"value": "What is the telephone on?"
},
{
"from": "assistant",
"value": "Counter"
},
{
"from": "user",
"value": "What kind of device is on the counter?"
},
{
"from": "assistant",
"value": "Phone"
},
{
"from": "user",
"value": "Is it an indoors or outdoors picture?"
},
{
"from": "assistant",
"value": "Indoors"
},
{
"from": "user",
"value": "What are the pieces of furniture that are made of wood called?"
},
{
"from": "assistant",
"value": "Cabinets"
},
{
"from": "user",
"value": "Are there cabinets under the kitchen sink?"
},
{
"from": "assistant",
"value": "Yes"
}
] | llava |
83 | [
{
"from": "user",
"value": "Given some people characters, Emily is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/122225/Emily.jpg</img>\n. Could you interpret the scene from this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/961189263.jpg</img>\n?"
},
{
"from": "assistant",
"value": "Emily is playing in the muddy water running along a grassy area ."
}
] | caption_f30k |
84 | [
{
"from": "user",
"value": "Given some people characters, Timothy is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/138252/Timothy.jpg</img>\n, Samuel is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/138252/Samuel.jpg</img>\n. Please describe the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3711030008.jpg</img>\n."
},
{
"from": "assistant",
"value": "Timothy and Samuel solicits her restaurants 's products on the street ."
}
] | caption_f30k |
85 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3022_DINNER_FOR_SCHMUCKS/[email protected]</img>\nthere are some object or person images. Kevin: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3022_DINNER_FOR_SCHMUCKS/3022_DINNER_FOR_SCHMUCKS_01.33.29.345-01.33.42.047@1_0.jpg</img>\nJoshua: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3022_DINNER_FOR_SCHMUCKS/3022_DINNER_FOR_SCHMUCKS_01.33.29.345-01.33.42.047@1_1.jpg</img>\nWhat will Kevin do next?"
},
{
"from": "assistant",
"value": "Kevin will offer to help Joshua because Joshua looks upset."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "People have empathy for others who look upset and often will offer to help."
}
] | vqa_vcr |
86 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000078196_359868_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000078196.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(439,87),(1000,696)</box>"
}
] | ground_ref |
87 | [
{
"from": "user",
"value": "Given some people characters, Tyler is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/84791/Tyler.jpg</img>\n, Sharon is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/84791/Sharon.jpg</img>\n, Lisa is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/84791/Lisa.jpg</img>\n. Can you provide a description of the image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/287257735.jpg</img>\n?"
},
{
"from": "assistant",
"value": "Tyler, Sharon and Lisa in blue shirts are selling shoes in an outside market ."
}
] | caption_f30k |
88 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000518100_2_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000518100.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(107,194),(393,744)</box>"
}
] | ground_ref |
89 | [
{
"from": "user",
"value": "Given some people characters, Scott is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/92849/Scott.jpg</img>\n, Elizabeth is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/92849/Elizabeth.jpg</img>\n. Tell me about the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3186687674.jpg</img>\n."
},
{
"from": "assistant",
"value": "Scott and Elizabeth are watching a video from a control room ."
}
] | caption_f30k |
90 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000084131_62631_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000084131.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(358,602),(676,909)</box>"
}
] | ground_ref |
91 | [
{
"from": "user",
"value": "Given some people characters, Sarah is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/120076/Sarah.jpg</img>\n, Jack is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/120076/Jack.jpg</img>\n. Analyze the contents of the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3030954495.jpg</img>\n."
},
{
"from": "assistant",
"value": "Sarah in a rainbow wig and colorful clothing shows something to Jack ."
}
] | caption_f30k |
92 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000144227_1368213_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000144227.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(585,552),(1000,912)</box>"
}
] | ground_ref |
93 | [
{
"from": "user",
"value": "In the image 0: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Tomorrow_Never_Dies/[email protected]</img>\nthere are some object or person images. Nancy: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Tomorrow_Never_Dies/NweXJhbk9P0@24_0.jpg</img>\nSandra: <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/movieclips_Tomorrow_Never_Dies/NweXJhbk9P0@24_1.jpg</img>\nWhy is Nancy pointing his gun at Sandra?"
},
{
"from": "assistant",
"value": "He wants to kill him."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Sandra looks cornered and Nancy looks like he's about to shoot."
}
] | vqa_vcr |
94 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000337452_11_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000337452.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(548,340),(810,862)</box>"
}
] | ground_ref |
95 | [
{
"from": "user",
"value": "Picture 1: <img>/mnt/bn/automl-aigc/yatai/data/refcoco/region_imgs/COCO_train2014_000000433723_495291_crop.jpg</img>\nPicture 2:<img>/mnt/bn/automl-aigc/yatai/data/coco/train2014/COCO_train2014_000000433723.jpg</img>\nIdentify the Picture 1 instance in the Picture 2 with grounding:"
},
{
"from": "assistant",
"value": "<ref>Picture 1</ref><box>(322,138),(625,789)</box>"
}
] | ground_ref |
96 | [
{
"from": "user",
"value": "Given some people characters, Jack is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/59119/Jack.jpg</img>\n, Linda is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/59119/Linda.jpg</img>\n, Laura is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/59119/Laura.jpg</img>\n. Please describe the picture <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/3234115903.jpg</img>\n."
},
{
"from": "assistant",
"value": "Jack, Linda and Laura playing a game similar to hockey , but with no ice ."
}
] | caption_f30k |
97 | [
{
"from": "user",
"value": "Given some people characters, Lisa is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/111795/Lisa.jpg</img>\n, Sarah is <img>/mnt/bn/automl-aigc/yatai/data/shikra/cwb_f30k_imgs/111795/Sarah.jpg</img>\n. Explain the content of this image <img>/mnt/bn/automl-aigc/yatai/data/f30k/flickr30k-images/2979998405.jpg</img>\n."
},
{
"from": "assistant",
"value": "Lisa and Sarah in costumes are on a stage with red curtains ."
}
] | caption_f30k |
98 | [
{
"from": "user",
"value": "<img>/mnt/bn/automl-aigc/yatai/data/LLaVA-finetune/coco/train2017/000000248300.jpg</img>\n\nWhat Nintendo video game character is on the left on top of the cake?\nA. wario\nB. luigi\nC. donkey kong\nD. mario\nAnswer with the option's letter from the given choices directly."
},
{
"from": "assistant",
"value": "B"
}
] | llava |
99 | [
{
"from": "user",
"value": "Emily is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3040_JULIE_AND_JULIA/3040_JULIE_AND_JULIA_00.11.10.621-00.11.18.855@1_1.jpg</img>\nRachel is <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3040_JULIE_AND_JULIA/3040_JULIE_AND_JULIA_00.11.10.621-00.11.18.855@1_4.jpg</img>\nIn the image <img>/mnt/bn/automl-aigc/yatai/data/vcr/vcr1images/lsmdc_3040_JULIE_AND_JULIA/[email protected]</img>\nWhy are Emily and Rachel both talking on cell phones?"
},
{
"from": "assistant",
"value": "Emily and Rachel are very busy people."
},
{
"from": "user",
"value": "explain why"
},
{
"from": "assistant",
"value": "Only someone who is very busy would be talking on a phone to someone else when they are at a table together."
}
] | vqa_vcr |
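Each preview row follows the same turn schema: a list of `{"from": ..., "value": ...}` messages, where user turns embed image paths in `<img>...</img>` tags and `ground_ref` answers encode the matched region as `<ref>...</ref><box>(x1,y1),(x2,y2)</box>`. The box values appear to lie on a 0–1000 scale (one row reaches `(1000,1000)`), which suggests coordinates normalized to the image size, though without a dataset card this remains an observation rather than a documented convention. The sketch below shows one way such a turn could be split apart; the regular expressions and function name are illustrative, not part of the dataset's tooling.

```python
import re

IMG_RE = re.compile(r"<img>(.*?)</img>")
BOX_RE = re.compile(r"<ref>(.*?)</ref><box>\((\d+),(\d+)\),\((\d+),(\d+)\)</box>")


def parse_turn(value):
    """Split one conversation turn into image paths, grounded boxes, and plain text."""
    images = IMG_RE.findall(value)
    boxes = [
        (ref, (int(x1), int(y1), int(x2), int(y2)))
        for ref, x1, y1, x2, y2 in BOX_RE.findall(value)
    ]
    text = IMG_RE.sub("", BOX_RE.sub("", value)).strip()
    return {"images": images, "boxes": boxes, "text": text}


# Answer format from the ground_ref rows above.
print(parse_turn("<ref>Picture 1</ref><box>(126,308),(294,944)</box>"))
# -> {'images': [], 'boxes': [('Picture 1', (126, 308, 294, 944))], 'text': ''}
```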
No dataset card yet