WHAT IS CLAIMED IS:
1. A vehicle, comprising:
one or more sensors (102, 103, 104, 105, 106) arranged on at least one of a dashboard (110), a roof (112), and a center console of the vehicle, or one or more image capturing devices (6) for capturing one or more images from a left side and a right side of the vehicle;
an electronic control unit (ECU) (220) configured to communicate with the one or more sensors (102, 103, 104, 105, 106) or the one or more image capturing devices (6); and
at least one of a morphing surface (236), a windshield display, and one or more displays (1, 2, 3, 4, 5, 900) configured to be controlled by the ECU (220).
2. A vehicle according to claim 1, comprising the one or more sensors (102, 103, 104, 105, 106), wherein the ECU (220) receives data from the one or more sensors (102, 103, 104, 105, 106) on one or more of a position of an occupant (100), a position of an object, a size of an object, or a shape of an object.
3. A vehicle according to claim 2, wherein the one or more sensors (102, 103, 104, 105, 106) are arranged under a black panel surface.
4. A vehicle according to any one of claims 2 and 3, wherein the one or more sensors (102, 103, 104, 105, 106) comprise at least one of a Time of Flight (ToF) sensor, a camera, an infrared (IR) sensor, a radar, an ultrasound sensor, a capacitive sensor, a brightness sensor, and a LIDAR sensor.
5. A vehicle according to any one of claims 2-4, wherein the one or more sensors (102, 103, 104, 105, 106) comprise a plurality of sensors (102, 103, 104), and
a first sensor (104) of the plurality of sensors (102, 103, 104) is configured to detect a position of the occupant (100) to activate or deactivate a second sensor (102) or a third sensor (103) of the plurality of sensors (102, 103, 104).
6. A vehicle according to claim 5, wherein, in response to the first sensor (104) detecting that the position of the occupant (100) is outside a range of the second sensor (102) and inside a range of the third sensor (103), the third sensor (103) is activated, and
in response to the first sensor (104) detecting that the position of the occupant (100) is outside the range of the third sensor (103) and inside the range of the second sensor (102), the second sensor (102) is activated.
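The occupant-position handoff recited in claims 5 and 6 can be illustrated with a short sketch. The following C fragment is a minimal, illustrative reading of those claims only; the range representation, the helper names, and the numeric values are assumptions and do not appear in the claims.

/* Minimal sketch of the sensor handoff of claims 5 and 6.  The structs,
 * the one-dimensional range test, and the numeric values are hypothetical
 * placeholders, not part of the claimed vehicle. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { double min_x, max_x; } sensor_range;   /* coverage of a sensor along the cabin axis */
typedef struct { double x; } occupant_position;         /* occupant position reported by the first sensor (104) */

static bool in_range(sensor_range r, occupant_position p)
{
    return p.x >= r.min_x && p.x <= r.max_x;
}

/* Returns 102 if the second sensor should be activated, 103 if the third
 * sensor should be activated, 0 if neither condition of claim 6 is met. */
static int select_active_sensor(sensor_range second, sensor_range third, occupant_position p)
{
    if (!in_range(second, p) && in_range(third, p))
        return 103;               /* occupant outside the range of sensor 102, inside sensor 103 */
    if (!in_range(third, p) && in_range(second, p))
        return 102;               /* occupant outside the range of sensor 103, inside sensor 102 */
    return 0;                     /* ambiguous or out of both ranges: leave the current state unchanged */
}

int main(void)
{
    sensor_range second = { -0.60, 0.05 };   /* left of the instrument cluster (120), illustrative values */
    sensor_range third  = { -0.05, 0.60 };   /* right of the instrument cluster (120), illustrative values */
    occupant_position p = { 0.30 };

    printf("activate sensor %d\n", select_active_sensor(second, third, p));
    return 0;
}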
7. A vehicle according to any one of claims 5 and 6, wherein the second sensor (102) and the third sensor (103) are positioned on a left side and a right side of an instrument cluster display (120) at a distance ranging from about 15 cm above to about 15 cm below the instrument cluster display.
8. A vehicle according to any one of claims 2-7, wherein the one or more sensors (102, 103, 104, 105, 106) are configured to detect a position of at least one of a center of a face of the occupant (100), an outer edge of a head of the occupant (100), a shoulder of the occupant (100), and a head of the occupant (100) with respect to a shoulder of the occupant (100).
9. A vehicle according to any one of claims 2-8, wherein each of the one or more sensors (102, 103, 104, 105, 106) has a field of view of at least about 30 degrees and operates at a close range of at least about 5 cm.
10. A vehicle according to any one of claims 2-9, wherein the vehicle comprises the windshield display, and
content on the windshield display is configured to be moved depending on the data received from the one or more sensors (102, 103, 104, 105, 106).
11. A vehicle according to claim 10, wherein the content on the windshield display is configured to be moved to compensate for movement of eyes of the occupant (100) with respect to at least one of an icon on the windshield display or an external object outside the vehicle.
12. A vehicle according to any one of claims 2-11, wherein the one or more sensors (102, 103, 104, 105, 106) are configured to detect one or more of an identification, a drowsiness or fatigue, a distraction, a head orientation, an eye gaze, a facial expression, a gender classification, an age classification, a body type, a quantity, a hand gesture, a thumbs
up gesture, an open palm gesture, a fist or fist gesture, a grabbing of the object, a releasing of the object, a proximity, and a proximity to the object of or by the occupant (100).
13. A vehicle according to any one of claims 2-12, wherein the one or more sensors (102, 103, 104, 105, 106) are configured to detect one or more of a type of the object, a size of the object, an orientation of the object, and a position of the object.
14. A vehicle according to any one of claims 2-13, further comprising one or more of an air vent (231), dashboard lighting (232), switches (233), a smartphone (234), a cup holder (235), a door pocket (237), a door armrest (238), a center console (239), a trunk (240), a seat (241), a seat back (242), and a roof (243) which are configured to be controlled by the ECU (220).
15. A vehicle according to any one of claims 2-14, wherein the vehicle comprises the morphing surface (236),
the morphing surface (236) is configured to change shape in response to a signal transmitted from the ECU (220), and
the signal transmitted from the ECU (220) is initiated using the data received from the one or more sensors (102, 103, 104, 105, 106).
16. A vehicle according to claim 15, wherein the morphing surface (236) comprises one or more electromechanical actuators which are configured to change the shape of the morphing surface (236).
17. A vehicle according to any one of claims 2-16, wherein the ECU (220) is configured to initiate a self-test process to determine whether the one or more sensors (102,
103, 104, 105, 106) and the ECU (220) are operating properly,
in response to detecting a malfunction of any of the one or more sensors (102, 103,
104, 105, 106) and the ECU (220), the ECU (220) is configured to display the detected malfunction or initialize a programmed malfunction protocol to self-cure the detected malfunction,
in response to no malfunction being detected, the ECU (220) is configured to read an input of a first sensor (105) of the one or more sensors (102, 103, 104, 105, 106) until an arm movement is detected by the first sensor (105),
in response to an arm movement being detected by the first sensor (105), the ECU (220) is configured to read a second sensor (102, 103, 104, 106) of the one or more sensors (102, 103, 104, 105, 106) to determine whether a recognized movement or gesture is performed, and
in response to determining that a recognized movement or gesture is performed, the ECU (220) is configured to transmit a signal to one or more vehicle components (232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243) based on the recognized movement or gesture.
18. A vehicle according to claim 17, wherein the one or more vehicle components (232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243) comprises the morphing surface (236) which is configured to change in shape based on the recognized movement or gesture.
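The control flow recited in claim 17 (self-test, arm-movement polling, gesture recognition, component signaling) can be sketched as follows. All function names and the canned return values in this C sketch are hypothetical placeholders used only to make the example compile and run; they are not part of the claimed vehicle.

/* Minimal sketch of the control flow of claim 17.  Every function below is a
 * stub with a canned return value so that the sketch compiles and runs; the
 * names and values are placeholders, not part of the claimed vehicle. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { GESTURE_NONE, GESTURE_THUMBS_UP, GESTURE_OPEN_PALM, GESTURE_FIST } gesture_t;

static bool ecu_self_test(void)              { return true; }                  /* sensors (102-106) and ECU (220) operate properly */
static void report_or_cure_malfunction(void) { puts("malfunction protocol"); } /* display the fault or run the self-cure protocol */
static bool read_arm_sensor(void)            { return true; }                  /* first sensor (105): arm movement detected?        */
static gesture_t classify_gesture(void)      { return GESTURE_OPEN_PALM; }     /* second sensor (102, 103, 104, 106)                */
static void transmit_component_signal(gesture_t g)                             /* signal to a vehicle component (232-243)           */
{
    printf("signal to vehicle component for gesture %d\n", (int)g);
}

/* One pass of the claimed sequence: self-test, poll for arm movement,
 * classify the gesture, then signal the addressed vehicle component. */
static void ecu_gesture_pass(void)
{
    if (!ecu_self_test()) {
        report_or_cure_malfunction();
        return;
    }
    while (!read_arm_sensor())               /* read the first sensor (105) until arm movement is detected */
        ;
    gesture_t g = classify_gesture();        /* recognized movement or gesture? */
    if (g != GESTURE_NONE)
        transmit_component_signal(g);        /* e.g. change the shape of the morphing surface (236), claim 18 */
}

int main(void) { ecu_gesture_pass(); return 0; }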
19. A method of using a vehicle according to any one of claims 2-18, the method comprising:
initiating a self-test process to determine whether the one or more sensors (102, 103, 104, 105, 106) and the ECU (220) are operating properly;
displaying the detected malfunction or initializing a programmed malfunction protocol to self-cure the detected malfunction in response to detecting a malfunction of any of the one or more sensors (102, 103, 104, 105, 106) and the ECU (220);
reading an input of a first sensor (105) of the one or more sensors (102, 103, 104, 105, 106) until an arm movement is detected by the first sensor (105) in response to no
malfunction being detected;
reading a second sensor (102, 103, 104, 106) of the one or more sensors (102, 103, 104, 105, 106) to determine whether a recognized movement or gesture is performed in response to an arm movement being detected by the first sensor (105); and
transmitting a signal to one or more vehicle components (232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243) based on the recognized movement or gesture in response to determining that a recognized movement or gesture is performed.
20. A method according to claim 19, wherein the one or more vehicle components (232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243) comprises a morphing surface (236), and the method further comprises changing a shape of the morphing surface (236) based on the recognized movement or gesture.
21. A vehicle according to claim 1, comprising the one or more image capturing devices (6) for capturing one or more images from a left side and a right side of the vehicle, and the one or more displays (1, 2, 3, 4, 5, 900),
wherein the ECU (220) is configured to control display of the one or more captured images on the one or more displays (1, 2, 3, 4, 5, 900).
22. A vehicle according to claim 21, wherein the one or more displays (1, 2, 3, 4, 5, 900) comprise one or more of camera monitoring system (CMS) dedicated displays (1, 2), a center display (3), a dashboard display (4), a door display (5), and a rear view display (900).
23. A vehicle according to claim 22, wherein a left side image captured by the one or more image capturing devices (6) is configured to be displayed on a first CMS dedicated display (1), a right side image captured by the one or more image capturing devices (6) is configured to be displayed on a second CMS dedicated display (2), and the left side image and the right side image are configured to be stitched together and displayed on at least one of the center display (3), the dashboard display (4), and the door display (5).
24. A vehicle according to any one of claims 21-23, wherein the ECU (220) is configured to initiate a self-test process to determine whether the one or more displays (1, 2, 3, 4, 5, 900), the one or more image capturing devices (6), and the ECU (220) are operating properly,
in response to detecting a malfunction of any of the one or more displays (1, 2, 3, 4, 5, 900), the one or more image capturing devices (6), and the ECU (220), the ECU (220) is configured to display the detected malfunction or initialize a programmed malfunction protocol to self-cure the detected malfunction,
in response to no malfunction being detected, the ECU (220) is configured to read or receive vehicle data to evaluate driving conditions,
in response to receiving vehicle data, the ECU (220) is configured to determine whether to use a comfort display or a full display,
in response to determining to use the full display, the ECU (220) is configured to process the one or more captured images and display the full display on the one or more displays (1, 2, 3, 4, 5, 900), and
in response to determining to use the comfort display, the ECU (220) is configured to process the one or more captured images and display the comfort display on the one or more displays (1, 2, 3, 4, 5, 900).
25. A vehicle according to claim 24, wherein the ECU (220) is configured to determine whether to use the comfort display or the full display using one or more of a vehicle speed, a number of lanes on a road, a current driving lane, a time of driving, a blind spot detection, a vehicle acceleration, and a vehicle brake activation.
26. A vehicle according to any one of claims 24 and 25, wherein the ECU (220) is configured to determine to use the full display in response to a vehicle speed being less than 6 km/h.
27. A vehicle according to any one of claims 24-26, wherein the ECU (220) is configured to determine to use the comfort display in response to a vehicle speed being greater than or equal to 6 km/h, the vehicle speed increasing or decreasing by 10 km/h for a duration of one minute, only one lane being detected, a blind spot warning being off, and a vehicle brake being less than 0.1 g.
28. A vehicle according to any one of claims 24-27, wherein the ECU (220) is configured to determine to use the comfort display in response to a vehicle speed being greater than or equal to 6 km/h and more than one lane being detected.
29. A vehicle according to any one of claims 24-28, wherein the ECU (220) is configured to determine to use the full display in response to a vehicle brake being greater than 0.1 g.
30. A vehicle according to any one of claims 24-29, wherein the ECU (220) is configured to determine to use the full display in response to a blind spot warning being on.
31. A vehicle according to any one of claims 24-30, wherein the full display comprises a separate display of a left side image captured by the one or more image capturing devices (6) and a right side image captured by the one or more image capturing devices (6),
and the comfort display comprises a stitched image of the left side image and the right side image on a single one of the one or more displays (1, 2, 3, 4, 5, 900).
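The full/comfort selection of claims 24 to 31 can be sketched as a simple decision function. In the C sketch below, the numeric thresholds (6 km/h, 10 km/h over one minute, 0.1 g) come from the claims; the data structure, the reading of the one-minute speed-change condition of claim 27, and the default case are assumptions made only for illustration.

/* Minimal sketch of the full/comfort decision of claims 24-31.  The struct
 * fields and the reading of claim 27's one-minute speed-change condition are
 * assumptions; only the numeric thresholds are taken from the claims. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { DISPLAY_FULL, DISPLAY_COMFORT } display_mode_t;

typedef struct {
    double speed_kmh;              /* current vehicle speed */
    double speed_change_kmh_1min;  /* absolute speed change over the last minute (assumed reading of claim 27) */
    int    lanes_detected;         /* number of lanes detected on the road */
    bool   blind_spot_warning;     /* true when a blind spot warning is active */
    double brake_g;                /* braking deceleration in g */
} vehicle_data_t;

static display_mode_t select_display_mode(vehicle_data_t d)
{
    /* Full display conditions (claims 26, 29, 30). */
    if (d.speed_kmh < 6.0 || d.brake_g > 0.1 || d.blind_spot_warning)
        return DISPLAY_FULL;

    /* Comfort display conditions (claims 28 and 27). */
    if (d.speed_kmh >= 6.0 && d.lanes_detected > 1)
        return DISPLAY_COMFORT;
    if (d.speed_kmh >= 6.0 && d.speed_change_kmh_1min <= 10.0 &&
        d.lanes_detected == 1 && !d.blind_spot_warning && d.brake_g < 0.1)
        return DISPLAY_COMFORT;

    return DISPLAY_FULL;           /* assumed default: separate left/right images (claim 31) */
}

int main(void)
{
    vehicle_data_t d = { 80.0, 3.0, 2, false, 0.02 };   /* illustrative values only */
    printf("%s display\n", select_display_mode(d) == DISPLAY_FULL ? "full" : "comfort");
    return 0;
}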
32. A method of using a vehicle according to any one of claims 21-31, the method comprising:
initiating a self-test process to determine whether the one or more displays (1, 2, 3, 4, 5, 900), the one or more image capturing devices (6), and the ECU (220) are operating properly;
displaying the detected malfunction or initializing a programmed malfunction protocol to self-cure the detected malfunction in response to detecting a malfunction of any of the one or more displays (1, 2, 3, 4, 5, 900), the one or more image capturing devices (6), and the ECU (220);
reading or receiving vehicle data using the ECU (220) to evaluate driving conditions in response to no malfunction being detected;
determining whether to use a comfort display or a full display using the ECU (220) in response to receiving vehicle data;
processing the one or more captured images and displaying the full display on the one or more displays (1, 2, 3, 4, 5, 900) in response to determining to use the full display; and
processing the one or more captured images and displaying the comfort display on the one or more displays (1, 2, 3, 4, 5, 900) in response to determining to use the comfort display.
33. A method according to claim 32, further comprising determining, using the ECU (220), whether to use the comfort display or the full display using one or more of a vehicle speed, a number of lanes on a road, a current driving lane, a time of driving, a blind spot detection, a vehicle acceleration, and a vehicle brake activation.
34. A method according to any one of claims 32 and 33, further comprising determining to use the full display in response to a vehicle speed being less than 6 km/h.
35. A method according to any one of claims 32-34, further comprising determining to use the comfort display in response to a vehicle speed being greater than or equal to 6 km/h, the vehicle speed increasing or decreasing by 10 km/h for a duration of one minute, only one lane being detected, a blind spot warning being off, and a vehicle brake being less than 0.1 g.
36. A method according to any one of claims 32-35, further comprising determining to use the comfort display in response to a vehicle speed being greater than or equal to 6 km/h and more than one lane being detected.
37. A method according to any one of claims 32-36, further comprising determining to use the full display in response to a vehicle brake being greater than 0.1 g.
38. A method according to any one of claims 32-37, further comprising determining to use the full display in response to a blind spot warning being on.
39. A method according to any one of claims 32-38, wherein the displaying of the full display comprises displaying a separate display of a left side image captured by the one or more image capturing devices (6) and a right side image captured by the one or more image capturing devices (6), and the displaying of the comfort display comprises displaying a stitched image of the left side image and the right side image on a single one of the one or more displays (1, 2, 3, 4, 5, 900).