Patent classifications
H04N5/935
Imaging device
An imaging device includes an imaging sensor that outputs an imaging signal representing a sequence of frame images of a photographic subject. A buffer memory temporarily stores data of the sequence of frame images from the imaging signal. A release switch is actuated by a user to output an image-taking signal. A controller, upon receipt of the image-taking signal from the release switch: (i) generates moving image data from at least some of the plurality of frame images stored in the buffer memory, (ii) generates at least one piece of still image data based on at least one frame image of the plurality of frame images stored in the buffer memory, and (iii) associates the moving image data with the still image data and records the moving image data and the still image data in a recording medium.
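The buffered-capture scheme in (i)–(iii) can be sketched as a ring buffer that is continuously filled by the sensor and read out when the release switch fires. A minimal sketch in Python; all class, method, and field names are illustrative assumptions, not from the patent:

```python
from collections import deque

class ImagingDevice:
    """Illustrative sketch of buffer-then-capture: the sensor keeps
    filling a bounded buffer, and the release switch triggers (i) a
    movie from the buffered frames, (ii) a still from one frame, and
    (iii) an associated record of both on the recording medium."""

    def __init__(self, buffer_frames=30):
        # Bounded buffer holding the most recent frames from the sensor.
        self.buffer = deque(maxlen=buffer_frames)
        self.recording_medium = []

    def on_frame(self, frame):
        """Called for every frame image the imaging sensor outputs."""
        self.buffer.append(frame)

    def on_release_switch(self):
        """Handle the image-taking signal from the release switch."""
        frames = list(self.buffer)
        moving_image = {"type": "movie", "frames": frames}
        still_image = {"type": "still", "frame": frames[-1]}
        # One record associates the moving image with the still image.
        record = {"movie": moving_image, "still": still_image}
        self.recording_medium.append(record)
        return record
```

Because the buffer already holds frames from before the switch is pressed, the recorded movie can include moments preceding the user's action.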
Method and system for recording and synchronizing audio and video signals and audio and video recording and synchronization system
A method and a system for recording and synchronizing audio and video signals are set forth. The audio signal and the video signal are stored together with time stamps from their respective associated system clocks. The invention adapts the duration of the recorded audio sequence to the duration of the associated video sequence in order to level out drift between the two system clocks. Alignment of the two system clocks is also introduced, based on a data transfer with variable waiting times for access to a transmission channel. This permits clock alignment with means available, for example, on a smartphone.
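The duration-adaptation step can be illustrated as resampling the audio so that its length, as measured by the audio clock, matches the length of the associated video sequence as measured by the video clock. A toy sketch using nearest-neighbour resampling; the function name and signature are assumptions, and a real implementation would use proper interpolation:

```python
def adapt_audio_duration(audio, audio_duration, video_duration):
    """Stretch or shrink `audio` (a list of samples) so that its
    duration matches `video_duration`, compensating for drift
    between the two system clocks. Nearest-neighbour resampling,
    for illustration only."""
    ratio = video_duration / audio_duration
    out_len = round(len(audio) * ratio)
    # Map each output sample back to the nearest input sample.
    return [audio[min(int(i / ratio), len(audio) - 1)] for i in range(out_len)]
```

If the video clock reports a longer duration than the audio clock, samples are duplicated; if shorter, samples are dropped.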
Fleet dashcam system for event-based scenario generation
Techniques for receiving and processing sensor data captured by a fleet of vehicles are discussed herein. In some examples, a fleet dashcam system can receive sensor data captured by electronic devices on a fleet of vehicles and can use that data to detect collision and near-collision events. The data of the collision or near-collision event can be used to determine a simulation scenario and a response of an autonomous vehicle controller to the simulation scenario, and/or it can be used to create a collision heat map to aid in operation of an autonomous vehicle.
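The collision heat map mentioned above can be illustrated as a spatial aggregation of event locations into grid cells, where each cell's count indicates collision density. A minimal sketch; the function name, coordinate scheme, and cell size are illustrative assumptions:

```python
from collections import Counter

def collision_heat_map(events, cell_size=1.0):
    """Aggregate (lat, lon) collision / near-collision event locations
    into grid cells; the per-cell counts form the heat map.
    Illustrative sketch only."""
    heat = Counter()
    for lat, lon in events:
        # Snap each event to the grid cell containing it.
        cell = (int(lat // cell_size), int(lon // cell_size))
        heat[cell] += 1
    return heat
```

An autonomous vehicle planner could then, for example, treat high-count cells as regions warranting extra caution.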
Method of timebase management for MPEG decoding with personal video recording functionality
Disclosed are various embodiments that facilitate recording to a storage medium in a personal video recorder (PVR) system. In one embodiment, a transport stream is received. The transport stream is stored into a memory. An index table is generated that provides information for locating particular frames recorded in the memory.
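The index table can be sketched as a list of byte offsets recorded for random-access frames (e.g. I-frames) as the transport stream is stored, so that playback can seek directly to a particular frame. A toy sketch in Python; the frame representation and table layout are illustrative assumptions:

```python
def build_index_table(frames):
    """Build an index of (frame number, byte offset) entries for
    random-access frames as the stream is stored to memory.
    `frames` is a sequence of (size_in_bytes, is_iframe) tuples.
    Hypothetical layout, for illustration only."""
    index, offset = [], 0
    for num, (size, is_iframe) in enumerate(frames):
        if is_iframe:
            # Only seekable frames need an index entry.
            index.append({"frame": num, "offset": offset})
        offset += size
    return index
```

During trick-play (fast-forward, rewind), the decoder can jump between the indexed offsets instead of parsing the stream sequentially.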
Localized audio source extraction from video recordings
Technologies are generally described for a system to process a collection of video recordings of a scene to extract and localize audio sources for the audio data. According to some examples, video recordings captured by mobile devices from different perspectives may be uploaded to a central database. Video segments capturing an overlapping portion of the scene at an overlapping time may be identified, and a relative location of each of the video capturing devices may be determined. Audio data for the video segments may be indexed with a sub-frame time reference and relative locations as a function of overlapping time. Using the indices that include the sub-frame time references and relative locations, audio sources for the audio data may be extracted and localized. The extracted audio sources may be transcribed and indexed to enable searching, and may be added back to each video recording as a separate audio channel.
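The sub-frame time alignment that underpins the indexing step can be illustrated with a toy cross-correlation search: given two audio tracks capturing the same scene, find the sample shift that best aligns them. Everything here is a simplified stand-in for the patent's method; the function name and scoring are assumptions:

```python
def best_offset(a, b, max_shift):
    """Find the integer-sample shift of track `b` relative to track
    `a` that maximises their dot-product overlap -- a toy stand-in
    for sub-frame time alignment between two recordings of the same
    scene. Illustrative only."""
    def score(shift):
        # Overlap the tracks at the given shift and sum the products.
        pairs = zip(a, b[shift:]) if shift >= 0 else zip(a[-shift:], b)
        return sum(x * y for x, y in pairs)
    return max(range(-max_shift, max_shift + 1), key=score)
```

With the relative time offsets and device locations known, the differing arrival times of a sound at each device are what make source extraction and localization possible.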
Method and infrastructure for synchronized streaming of content
Systems and methods for synchronizing the playback of network media across multiple content playback devices, termed herein as “playback devices”, “clients”, or “client devices”. In one implementation, client devices are controlled to parse and buffer media content separately. Once all clients are ready, a controller may cause the client devices to start in a synchronized fashion based on signals sent by the controller. The controller adjusts the timing of the signal so that the outputs are displayed in synchronization on each client device. In other implementations, device lag times may be measured. In still other implementations, a master device may synchronize playback of media content on slave devices. In yet other implementations, devices may buffer and join playback of media content occurring on other devices. In further implementations, the systems and methods may be expanded to include steps of processing authentication for service providers prior to arranging synchronized playback.
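The lag-compensation idea can be sketched as a controller that, knowing each client's measured output lag, schedules its start signals so every client's output appears at the same instant. A minimal sketch; the function name, timing model, and units are illustrative assumptions:

```python
def start_times(now, client_lags, lead):
    """Schedule per-client signal send times so all outputs appear at
    the same target instant. `client_lags` maps client name to its
    measured output lag; `lead` is extra headroom so no send time is
    in the past. Illustrative timing model only."""
    # Target must be late enough for the laggiest client to make it.
    target = now + lead + max(client_lags.values())
    # Laggier clients get their signal earlier.
    return {name: target - lag for name, lag in client_lags.items()}
```

For example, with lags of 100 ms and 300 ms, the slower client's signal is sent 200 ms before the faster client's, so both outputs land on the same target instant.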
Systems and methods for recording haptic data for use with multi-media data
A system includes a recorder configured to record audio and/or video of a subject of interest and to output a recording of that subject, together with a non-contact sensor associated with the recorder. The non-contact sensor is constructed and arranged to measure movement and/or vibration of the subject of interest from substantially the same perspective, and at the same time, as the recorder. The system includes a controller configured to transform the movement and/or vibration measured by the non-contact sensor into a tactile data stream for sending to a haptic display device. The haptic display device plays back the tactile data stream with the recording of the subject of interest, providing the user with haptic effects corresponding to the measured movement and/or vibration in synchronization with the recording.
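The controller's transform step can be illustrated as mapping raw vibration measurements to timestamped, normalised haptic intensities that play back in sync with the recording. A deliberately simplistic sketch; the mapping, names, and units are illustrative assumptions, not the patent's method:

```python
def to_tactile_stream(vibration, sample_period_ms, max_amplitude):
    """Convert non-contact vibration measurements into a timestamped
    tactile data stream of intensities in [0, 1], clipped at
    `max_amplitude`, for synchronized playback with the recording.
    Illustrative mapping only."""
    return [
        {"t_ms": i * sample_period_ms,
         "intensity": min(abs(v) / max_amplitude, 1.0)}
        for i, v in enumerate(vibration)
    ]
```

Because each tactile sample carries the same timeline as the recording, a haptic display device can render the effects frame-accurately alongside the audio/video.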