Contact

E-mail: sales@coolifegroup.com

Phone: 15814041681

Add: Block 3, Wujiyungu, Rongxing Road, Xinci Street, Pingshan District, Shenzhen, China.

CL-IT-003: Intelligent Connected Sightseeing Vehicle

Category: CL-IT-003 Sightseeing Mini Bus



I. Product Overview

The autonomous shuttle vehicle showcases a fully self-developed autonomous driving solution, integrating sensor technologies such as cameras, LiDAR, millimeter-wave radar, ultrasonic sensors, and GPS/IMU. Relying on advanced perception algorithms and behavior-prediction technology, the vehicle achieves precise obstacle detection and response. It uses a positioning strategy primarily based on laser positioning, supplemented by RTK positioning, optimizing location accuracy in complex environments and ensuring stable operation of the autonomous vehicle in the event of signal interference.
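The laser-primary, RTK-supplementary strategy above can be sketched as a simple source-selection rule. This is an illustrative sketch only, not the vendor's software; all names (`PositionFix`, `select_position`) and the quality threshold are assumptions:

```python
from dataclasses import dataclass

@dataclass
class PositionFix:
    x: float          # metres, map frame (illustrative)
    y: float
    quality: float    # 0.0 (unusable) .. 1.0 (full confidence)

def select_position(laser: PositionFix, rtk: PositionFix,
                    min_quality: float = 0.5) -> PositionFix:
    """Prefer the laser (LiDAR map-matching) fix; fall back to RTK
    when laser quality drops, e.g. in feature-poor open areas."""
    if laser.quality >= min_quality:
        return laser
    if rtk.quality >= min_quality:
        return rtk
    # Neither source is trustworthy: return the better of the two.
    return laser if laser.quality >= rtk.quality else rtk
```

In practice such a switch would be smoothed by a filter rather than hard-selected per cycle, but the fallback logic is the same.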

To adapt to mixed traffic conditions of pedestrians and vehicles, the shuttle is equipped with a planning and control unit designed specifically for autonomous vehicles. This unit integrates multi-sensor fusion technology, supporting L3/L4-level autonomous driving requirements. The integrated computing platform is capable of processing inputs from multiple cameras, LiDAR point cloud data, millimeter-wave radar signals, and ultrasonic sensor information. Additionally, the built-in IMU enhances the vehicle's positioning and navigation accuracy, while capabilities in target fusion, combined positioning, and decision-planning processing further optimize the vehicle's autonomous driving performance.

The autonomous shuttle vehicle not only supports key autonomous driving functions such as automatic tracking, obstacle identification, automatic braking, station docking, local path planning, and automatic parking but also provides comprehensive services including system setting, calibration, fault diagnosis, and software upgrades. Its autonomous driving algorithm software system consists of perception, fusion, planning, and control modules, which are modularly designed and interconnected through API interfaces. This provides educational institutions with high openness and flexibility, allowing users to customize functional modules to meet various training and teaching needs.
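The perception, fusion, planning, and control modules communicating through API interfaces can be sketched as a minimal pipeline. All class names, method names, thresholds, and the returned values here are illustrative assumptions, not the vehicle's actual API:

```python
# Minimal sketch of a perception -> fusion -> planning -> control
# pipeline with simple call interfaces between modules.

class Perception:
    def detect(self, sensor_frame):
        # Real module: run camera/LiDAR/radar detectors on the frame.
        return [{"id": 1, "distance_m": 12.0, "type": "pedestrian"}]

class Fusion:
    def fuse(self, detections):
        # Real module: associate multi-sensor tracks; here, pick nearest.
        return min(detections, key=lambda d: d["distance_m"])

class Planner:
    def plan(self, nearest_obstacle):
        # Brake if obstacle within a safety distance, else keep cruising.
        return "brake" if nearest_obstacle["distance_m"] < 15.0 else "cruise"

class Controller:
    def act(self, command):
        return {"throttle": 0.0, "brake": 1.0} if command == "brake" \
            else {"throttle": 0.3, "brake": 0.0}

def run_pipeline(frame):
    detections = Perception().detect(frame)
    target = Fusion().fuse(detections)
    command = Planner().plan(target)
    return Controller().act(command)
```

Because each stage is reached only through its method interface, an educational user could swap in a custom planner or perception module without touching the rest of the chain, which mirrors the openness described above.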

II. Feature Overview

The autonomous shuttle vehicle integrates advanced autonomous driving technology and a multi-sensor fusion computing platform, designed to meet L3/L4 level autonomous driving needs, with the following main functions and features:

1. Multi-Camera and Visual Recognition: Supports inputs from multiple cameras, with robust visual recognition and automatic parking data processing capabilities.

2. LiDAR Point Cloud Processing: Capable of processing LiDAR-generated point cloud data, supporting precise environmental perception and obstacle identification.

3. Millimeter-Wave Radar Data Processing: Integrates inputs from multiple millimeter-wave radars, providing high-precision moving target detection and tracking.

4. Ultrasonic Sensor Application: Equipped with 12 ultrasonic sensors, enhancing short-range obstacle detection and avoidance capabilities.

5. IMU Integration: Built-in Inertial Measurement Unit (IMU) enhances the vehicle's motion state monitoring and precise positioning.

6. Target Fusion and Positioning: Achieves target fusion and combined positioning functions, improving the accuracy and reliability of decision planning.

7. Vehicle Data Management: Supports vehicle data access and complex data processing, providing support for autonomous driving decisions.

8. Multi-Channel Vehicle Control: Features multi-channel control buses, meeting complex vehicle control requirements.

9. System Setting and Calibration: Provides system setting and calibration functions, optimizing autonomous driving system performance.

10. Fault Diagnosis: Built-in fault diagnosis function, ensuring stable system operation.

11. System and Software Upgrades: Supports system and software upgrades, ensuring continuous technological progress and compatibility.
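Item 6's target fusion can be illustrated with a greedy range-gated association of camera and radar detections. This is a didactic sketch under assumed data shapes (the field names and the 2 m gate are not from the vendor):

```python
def associate(camera_targets, radar_targets, gate_m=2.0):
    """Greedy nearest-neighbour association of camera and radar
    detections by range; unmatched targets pass through unchanged."""
    fused = []
    unmatched_radar = list(radar_targets)
    for cam in camera_targets:
        best, best_d = None, gate_m
        for rad in unmatched_radar:
            d = abs(cam["range_m"] - rad["range_m"])
            if d < best_d:
                best, best_d = rad, d
        if best is not None:
            unmatched_radar.remove(best)
            # Radar supplies accurate range/speed; camera supplies the class.
            fused.append({"range_m": best["range_m"],
                          "speed_mps": best["speed_mps"],
                          "label": cam["label"]})
        else:
            fused.append(dict(cam))
    fused.extend(dict(r) for r in unmatched_radar)
    return fused
```

Production stacks use track-level fusion with Kalman filters and Hungarian assignment; the sketch only shows why fusing the two modalities yields both an accurate range and a semantic label per target.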

The autonomous driving algorithm software system uses a modular design, including perception, fusion, planning, control, and a full set of software, with API interfaces for inter-module communication, forming a complete autonomous driving software system. The vehicle offers open API interfaces, allowing educational institutions to independently develop and verify related functional modules, providing an experimental platform for teaching and research in autonomous driving technology, deepening students' understanding and application skills in autonomous driving technology.

III. Vehicle Configuration

1. By-Wire Shuttle (Parameters)

2. Computing Unit

· CPU: 6 cores 12 threads, base frequency 2.9GHz, 12M L3 cache, ensuring powerful data processing capabilities.

· GPU: Independent image processor, 3584 CUDA cores, memory speed 15Gbps, capacity 12GB GDDR6, supporting complex image processing tasks.

· Memory: 16GB LPDDR4x 2666MHz, ensuring smooth system operation.

· Storage: 500GB solid-state drive, providing ample data storage space.

· Interfaces: Equipped with Gigabit Ethernet + WiFi, USB3.0, meeting diverse communication needs.

3. Front Camera

· Model: Sensor IMX291, lens Size 1/2.8, providing clear image capture.

· Interface: USB3.0, ensuring high-speed data transfer.

· Maximum Effective Pixels: 2 million, resolution 1920x1080, supporting high-definition video capture.

· Output Format: MJPEG/YUY2 (YUYV), meeting different image processing needs.

· Maximum Frame Rate: 50 fps (YUV/MJPEG), ensuring smooth video playback.

· Detection Targets: Vehicles, pedestrians, traffic signs, traffic lights, etc., enhancing environmental perception capabilities.

4. 16-Line LiDAR

· Scanning Channels: 16 lines, providing 360-degree spatial scanning.

· Laser Wavelength: 905nm, suitable for various detection environments.

· Detection Distance: 70 meters to 200 meters, meeting long-range sensing needs.

· Power Supply Range: 9V-36VDC, adaptable to different power systems.

· Communication Interface: Ethernet, with PPS time synchronization, ensuring stable data transmission.

· Data: Includes three-dimensional spatial coordinates and point cloud reflectivity, supporting precise environmental modeling.
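Each LiDAR return carries three-dimensional coordinates plus a reflectivity value. The following sketch decodes such point records from a raw buffer; the 13-byte record layout (three float32 coordinates plus one uint8 reflectivity) is a hypothetical example, not this LiDAR's actual wire format:

```python
import struct

# Hypothetical 13-byte point record: three float32 coordinates
# (metres) followed by one uint8 reflectivity value (0-255).
POINT_FMT = "<fffB"
POINT_SIZE = struct.calcsize(POINT_FMT)   # 13 bytes, no padding

def unpack_points(buf: bytes):
    """Decode a buffer of fixed-size point records into tuples."""
    points = []
    for offset in range(0, len(buf) - POINT_SIZE + 1, POINT_SIZE):
        x, y, z, reflectivity = struct.unpack_from(POINT_FMT, buf, offset)
        points.append((x, y, z, reflectivity))
    return points

# Round-trip a sample point to demonstrate the layout.
sample = struct.pack(POINT_FMT, 1.5, -0.25, 0.75, 200)
```

A real driver would follow the sensor's published packet specification, which also interleaves azimuth and timestamp fields.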

5. Combined Positioning Unit

· Positioning Modes: Supports RTK, GNSS single point, and triple-mode seven-frequency (GPS, BDS, GLONASS).

· Built-in: 6-axis IMU, providing accurate posture recognition.

· Attitude Accuracy: 0.1° (baseline length 2m), ensuring high-precision positioning.

· Positioning Accuracy: Single point L1/L2 at 1.2m, DGPS at 0.4m, RTK at 1cm+1ppm, meeting high-precision navigation requirements.

· Input Voltage: 9-32V DC (standard adapter 12V DC).
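The RTK accuracy figure of 1cm+1ppm means a 1 cm constant error plus 1 mm per kilometre of baseline to the reference station. A small worked example (the function name is ours):

```python
def rtk_error_m(baseline_m: float) -> float:
    """Horizontal RTK error bound: 1 cm constant part plus
    1 ppm (one millionth) of the baseline length."""
    return 0.01 + 1e-6 * baseline_m

# At a 5 km baseline: 0.01 m + 0.005 m = 0.015 m (1.5 cm) bound.
```

Even at several kilometres from the base station, the bound stays at centimetre level, which is why RTK serves well as the supplementary positioning source.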

6. Millimeter-Wave Radar

· Operating Frequency: 76GHz to 77GHz, suited for high-precision long-range detection.

· Detection Distance: 0.2m to 250m, covering a wide range of distances.

· Distance Resolution and Accuracy: Close range ±0.39m, long range ±1.79m; close range accuracy ±0.10m, long range accuracy ±0.40m.

· Speed Range: -400 km/h to +200 km/h, supporting the detection of high-speed moving targets.

· Speed Resolution and Accuracy: Long range 0.37km/h, close range 0.43km/h; speed accuracy ±0.1 km/h.

· Detection Targets: Includes targets moving away, approaching, stationary, and crossing.

· Data Output: Supports CAN/CANFD, providing target ID, distance, speed, and radar cross-section (RCS).

· Working Environment: Temperature -40°C to +85°C, voltage 9-16V, protection level IP67.
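The radar reports target ID, distance, speed, and RCS over CAN/CANFD. The sketch below decodes one such target from a CAN payload; the byte layout and scale factors are hypothetical examples (the real assignments come from the radar's DBC file):

```python
import struct

# Hypothetical 8-byte CAN payload layout (the radar's DBC defines
# the real one):
#   byte 0    : target ID
#   bytes 1-2 : distance, unsigned, 0.01 m per bit
#   bytes 3-4 : relative speed, signed, 0.1 km/h per bit
#   byte 5    : RCS, signed, dBsm
#   bytes 6-7 : reserved
def decode_radar_target(payload: bytes) -> dict:
    tid, dist_raw, speed_raw, rcs = struct.unpack_from("<BHhb", payload)
    return {
        "id": tid,
        "distance_m": dist_raw * 0.01,
        "speed_kmh": speed_raw * 0.1,
        "rcs_dbsm": rcs,
    }

# Encode a sample target: ID 7, 123.45 m, -25.0 km/h (approaching), 10 dBsm.
frame = struct.pack("<BHhbxx", 7, 12345, -250, 10)
```

Negative speed here denotes an approaching target, matching the radar's ability to classify targets as approaching, receding, stationary, or crossing.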

7. Ultrasonic Sensor

· Power Supply: +12V to 24V.

· Operating Temperature: -40°C to +85°C.

· Measuring Range: 130mm to 5000mm, adjustable measuring distance.

· Accuracy: 0.5% of the detection distance.

· Resolution: 5mm.

· Communication Interface: Compatible with CAN2.0A and CAN2.0B.

· Sampling Rate: 100ms, with a probe emission angle of 60 degrees.
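The "0.5% of the detection distance" accuracy figure combines with the 5 mm resolution as follows (a small worked sketch; the floor-at-resolution interpretation is our assumption):

```python
def ultrasonic_uncertainty_mm(distance_mm: float,
                              accuracy_pct: float = 0.5,
                              resolution_mm: float = 5.0) -> float:
    """Measurement uncertainty: 0.5% of the detected distance,
    never better than the 5 mm sensor resolution."""
    return max(distance_mm * accuracy_pct / 100.0, resolution_mm)

# At the 5000 mm maximum range: 0.5% -> 25 mm.
# At the 130 mm minimum range: 0.65 mm < resolution, so 5 mm applies.
```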

IV. Vehicle Function Description

1. Complete Autonomous Driving System: The vehicle is equipped with a complete autonomous driving system, ensuring it can operate normally under the management of the system.

2. L4 Autonomous Driving Capabilities: Based on high-precision maps, the system can perform L4-level autonomous driving functions, including but not limited to active path tracking, obstacle identification and avoidance, autonomous braking, precise station docking, and dynamic local path planning.

3. Adjustable Driving Parameters: Provides a user-friendly interface for driving parameter settings, allowing users to adjust the autonomous driving system's strategy according to actual needs.

4. High-Precision Map Generation: The system integrates the function of generating high-precision maps, capable of recording and processing point cloud data, and creating detailed navigation maps in conjunction with map-making software.

5. Sensor Application Training Software: Provides dedicated training software for each sensor, supporting step-by-step functional teaching and technical training.

6. Multi-Technology Fusion Positioning: Combines a variety of advanced positioning technologies, supporting precise path tracking and navigation in indoor and outdoor environments.
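One standard way to combine several positioning technologies, as in item 6, is inverse-variance weighting: each source contributes in proportion to its confidence. A one-dimensional sketch (the function and its use of LiDAR/GNSS as the two sources are illustrative):

```python
def fuse_1d(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted fusion of two independent 1-D position
    estimates (e.g. LiDAR map-matching indoors, GNSS/RTK outdoors).
    Returns the fused estimate and its (smaller) variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var
```

When one source degrades (its variance grows), its weight shrinks automatically, which is how indoor/outdoor transitions can be handled without a hard switch.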

V. Supporting Software Description

1. Vision Testing Software:

· Includes modules for vehicle and pedestrian recognition, lane line and traffic light recognition.

· Supports rapid installation, calibration, debugging, data collection, and processing of cameras.

· Provides a complete development toolkit, including Python3, TensorFlow, CUDA, etc.

· Includes machine learning models, training samples, and a real-time processing DEMO, supplemented with a training dataset.

2. Radar Testing Software:

· Supports testing of millimeter-wave and ultrasonic sensors, including distance detection and target recognition.

· Real-time reception and analysis of radar data streams, applicable for target analysis under different working conditions.

· Provides the capability to read fault information.

3. LiDAR Testing Software:

· Supports interface testing and LiDAR configuration (including network settings, time synchronization, motor parameter adjustments, etc.).

· Real-time reception of LiDAR data streams, visualizing point cloud information.

4. Combined Navigation Testing Software:

· Includes interface testing and calibration of the combined navigation system (initial alignment, navigation modes, coordinate axis configuration, data output, etc.).

· Supports the reception and analysis of combined navigation data, offering the ability to read fault information.

