Experience ride representation apparatus and method for real-sense media service based on multi-vision
Abstract
Disclosed herein are an experience ride representation apparatus and method for a real-sense media service based on a multi-vision. In the apparatus, a motion of a user is reflected so that the user can move among a plurality of screen units according to a first signal received from the user, a second signal is transmitted to the user based on real-sense media contents reproduced on the screen unit, and metaverse contents are provided to a plurality of users when the screen unit is shared by the plurality of users.
15 Claims
1. An experience ride representation apparatus for a real-sense media service based on a multi-vision comprising a plurality of screen units for one or more users,

wherein one of the plurality of screen units reflects a motion of a user in response to a first signal of the user,

wherein a second signal is transmitted to the user based on real-sense media contents reproduced on the screen unit, and

wherein metaverse contents are provided to the screen unit, and the real-sense media contents are provided to a remaining screen unit when the screen unit is shared by a plurality of users.
2. An experience ride representation apparatus for a real-sense media service based on a multi-vision, comprising:

a content server providing real-sense media contents and real-sense effect data corresponding to the real-sense media contents;

a plurality of screen units reproducing the real-sense media contents;

a real-sense device providing a real-sense effect to a user based on the real-sense effect data;

a motion recognizing unit recognizing a motion of the user to generate motion information;

a motion processing unit reflecting the motion of the user in the screen unit based on the motion information;

a screen moving unit requesting the motion processing unit to reflect the motion of the user in the screen unit corresponding to a screen moving signal when the motion recognizing unit receives the screen moving signal from the user; and

a metaverse server providing metaverse contents shared by a plurality of users and metaverse real-sense effect data corresponding to the metaverse contents in the case in which motions of the plurality of users are reflected in the screen unit,

wherein the metaverse contents are provided to the screen unit, and the real-sense media contents are provided to a remaining screen unit when motions of a plurality of users are reflected in the screen unit.

(Dependent claims: 3, 4, 5, 6, 7, 8, 9)
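The interactions among the units recited in claim 2 can be sketched in code. The following is a minimal Python illustration, not part of the patent: every class, method, and string label here is an assumed name, and the sketch only models how motions are reflected in screen units and how the content server versus the metaverse server would serve a screen unit depending on whether motions of a plurality of users are reflected in it.

```python
from dataclasses import dataclass, field


@dataclass
class ScreenUnit:
    """One screen of the multi-vision; tracks users whose motions it reflects."""
    name: str
    users: set = field(default_factory=set)


class ExperienceRideApparatus:
    """Hypothetical wiring of the claimed units (all names are illustrative)."""

    def __init__(self, screens):
        self.screens = {s.name: s for s in screens}

    def reflect_motion(self, user, screen_name):
        # Motion recognizing unit -> motion processing unit:
        # the user's motion is reflected in the named screen unit.
        self.screens[screen_name].users.add(user)

    def move_screen(self, user, src, dst):
        # Screen moving unit: on a screen-moving signal, the motion
        # processing unit reflects the user in the target screen unit.
        self.screens[src].users.discard(user)
        self.screens[dst].users.add(user)

    def contents_for(self, screen_name):
        # Metaverse server serves a screen unit in which motions of a
        # plurality of users are reflected; the content server serves
        # the remaining screen units with real-sense media contents.
        shared = len(self.screens[screen_name].users) > 1
        return "metaverse contents" if shared else "real-sense media contents"
```

For example, when two users' motions are reflected in the same screen unit, `contents_for` selects metaverse contents for that unit, and once one user moves to another screen unit each unit again receives real-sense media contents.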
10. An experience ride representation method for a real-sense media service based on a multi-vision, comprising:

receiving a first signal from a user;

reflecting a motion of the user in one of a plurality of screen units in response to the first signal;

transmitting a second signal to the user based on real-sense media contents reproduced on the screen unit; and

providing metaverse contents to the screen unit, and the real-sense media contents to a remaining screen unit, when the screen unit is shared by a plurality of users.
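The content-routing rule in the final providing step can be sketched as a small Python function. This is a hypothetical illustration only; the function name, argument shape, and content labels are assumptions, not terms from the patent.

```python
def assign_contents(screen_users):
    """Choose the contents each screen unit reproduces.

    screen_users maps a screen-unit id to the set of users currently
    sharing that screen unit. Per the claim, a screen unit shared by a
    plurality of users receives metaverse contents, while each
    remaining screen unit receives real-sense media contents.
    """
    return {
        screen: "metaverse" if len(users) > 1 else "real-sense media"
        for screen, users in screen_users.items()
    }
```

For instance, a configuration with one shared screen unit and one single-user screen unit yields metaverse contents on the shared unit and real-sense media contents on the other.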
11. An experience ride representation method for a real-sense media service based on a multi-vision, comprising:

providing, by a content server, real-sense media contents and real-sense effect data corresponding to the real-sense media contents to a screen unit;

reproducing, by the screen unit, the real-sense media contents;

providing, by a real-sense device, a real-sense effect to a user based on the real-sense effect data;

recognizing, by a motion recognizing unit, a motion of the user to generate motion information;

reflecting, by a motion processing unit, the motion of the user in the screen unit based on the motion information;

requesting, by a screen moving unit, the motion processing unit to reflect the motion of the user in the screen unit corresponding to a screen moving signal when the motion recognizing unit receives the screen moving signal from the user; and

providing, by a metaverse server, metaverse contents shared by a plurality of users and metaverse real-sense effect data corresponding to the metaverse contents in the case in which motions of the plurality of users are reflected in the screen unit,

wherein the metaverse contents are provided to the screen unit, and the real-sense media contents are provided to a remaining screen unit when motions of a plurality of users are reflected in the screen unit.

(Dependent claims: 12, 13, 14, 15)
Specification