Intellect-Partners

Categories
Automotive

LiDAR Technology in Autonomous Vehicles

Introduction:

LiDAR, an acronym for “light detection and ranging” or “laser imaging, detection, and ranging,” is a sensor used for determining ranges by targeting an object or a surface with a laser and measuring the time for the reflected light to return to the receiver. Because it scans its environment, it is also sometimes called 3D laser scanning. In particular, LiDAR image registration (LIR) is a critical task that focuses on techniques for aligning, or registering, LiDAR point cloud data with corresponding images. It involves two types of data that have different properties and may be acquired from different sensors at different times or under different conditions. By accurately aligning LiDAR point clouds with captured 2D images, a registration method yields a detailed and highly informative understanding of the environment.

How does LiDAR work?

LiDAR works by sending out a pulse of light and waiting for its return. It measures the round-trip time, i.e., how long the pulse takes to come back to the receiver. From this time, the distance to the object can be calculated.
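In code, this time-of-flight calculation is straightforward. The sketch below (the function name and timing value are illustrative, not taken from any particular sensor) converts a measured round-trip time into a distance, dividing by two because the pulse travels to the target and back:

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Estimate the distance (in meters) to the target from the
    pulse's round-trip time (in seconds)."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A pulse that returns after 200 nanoseconds indicates a target about 30 m away.
print(lidar_distance_m(200e-9))
```

Real sensors repeat this measurement millions of times per second across many laser channels, which is what makes dense environmental scanning possible.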


Fig. 1. Working of LiDAR

Application Areas of LiDAR
The fusion of LiDAR point clouds and camera images is a popular example of Multi-Remote Sensing Image Registration (MRSIR). LiDAR systems today come in various forms, such as static and mobile units, and by geographical use they are classified as terrestrial, aerial, or marine.
The application of LiDAR is very broad. It has uses in surveying, archaeology, geology, forestry, and other fields such as:

  • Autonomous driving: LIR is used to align sensor data to create a more accurate and complete representation of the environment.
  • Robotics: Align sensor data to create more accurate maps and enable more precise localization.
  • 3D mapping: Align data from multiple sensors to create detailed 3D models of the environment.
  • Augmented Reality (AR): Synchronizing virtual elements to correspond with the physical environment.

Utilization of LiDAR in Self-Driving Vehicles

3D Point Cloud and Calculation of Distance
In the realm of road safety, numerous automobile manufacturers are either using or exploring the installation of LiDAR technology in their vehicles.


Fig. 2. LiDAR Technology in Self-Driving Vehicles [Source: https://velodynelidar.com/what-is-lidar/#:~:text=A%20typical%20lidar%20sensor%20emits,calculate%20the%20distance%20it%20traveled]

By iterating this process multiple times within seconds, a detailed, live 3D representation of the environment is generated, referred to as a point cloud.
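Each return becomes a point in 3D space, placed using the beam's pointing angles and the measured range; accumulating many such returns yields the point cloud. A minimal sketch of that geometry (the angle conventions and sample values here are assumptions for illustration):

```python
import math

def to_cartesian(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one LiDAR return (range plus beam angles) into an (x, y, z) point."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# Returns from one sweep accumulate into a point cloud.
sweep = [(10.0, 0.0, 0.0), (10.0, 90.0, 0.0), (5.0, 45.0, 30.0)]
cloud = [to_cartesian(r, az, el) for r, az, el in sweep]
```

A production pipeline would also correct for the vehicle's own motion during the sweep, but the core transformation is this range-and-angle conversion.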

Advantages of Mounting Lidar Above Autonomous Vehicles
Within an autonomous vehicle, the LiDAR sensor captures extensive data through rapid analysis of numerous laser pulses. This information, forming a ‘3D point cloud’ from laser reflections, undergoes processing by an integrated computer to generate a dynamic three-dimensional representation of the surroundings. Training the onboard AI model with meticulously annotated point cloud datasets becomes pivotal to ensuring the precise creation of this 3D environment by LiDAR. The annotated data empowers autonomous vehicles to detect, identify, and categorize objects, enhancing their ability to accurately discern traffic lanes, road signs, and moving entities, and evaluate real-time traffic scenarios through image and video annotations.
Beyond research, the use of LiDAR within autonomous vehicles is being actively explored in industry. Automakers have begun integrating LiDAR into advanced driver assistance systems (ADAS), enabling a comprehensive grasp of dynamic traffic conditions. The journey toward autonomous driving safety relies on these systems, which make swift, precise decisions through rapid analysis of vast numbers of data points.

Cutting-edge approaches
However, challenges remain in developing a fully automated vehicle that guarantees near-perfect accuracy in critical tasks such as object detection and navigation. To overcome these challenges, many researchers and automobile companies have been working to improve the technology. Cutting-edge approaches are broadly categorized into four distinct pipelines: information-based, feature-based, ego-motion-based, and deep learning-based. Deep learning-based pipelines have shown the greatest gains in accuracy. LiDAR technology not only enhances convenience but also plays a pivotal role in reducing severe collisions. The latest advancements in this domain include new LiDAR sensor designs and the shift from traditional mechanical scanning to FMCW and flash technologies.

Patenting Trends for LiDAR Technology in Autonomous Vehicles

The field of autonomous vehicle technology has witnessed a notable rise in patent submissions, especially concerning sensor technology, mapping techniques, decision-making algorithms, and communication systems. Pioneering the advancements are entities such as Google, Tesla, and Uber, whereas longstanding automotive giants like Ford, General Motors, and BMW have also been actively filing patents. In the United States, a significant emphasis lies on artificial intelligence (AI) and augmented reality within the market, with car manufacturers and developers collaborating to introduce self-driving vehicles to the public. Autonomous cars are predicted to change the driving experience and introduce a whole new set of problems.
Despite SARTRE's initial patent submission in the autonomous vehicle domain, it was perceived primarily as a patent for an AI system designed for highway navigation or restricted roadways. US patent filings for self-driving cars were scarce before 2006, largely influenced by a trend that emerged in the late 1990s and persists today: the limited number of such patents granted by the US Patent Office.

Challenges in Patenting Technology for Autonomous Vehicles
The challenges in patenting technology for self-driving vehicles emerge when these vehicles are involved in incidents or insurance-related events. Owners typically confront three choices:

  1. Assuming liability for any harm or property damage caused by their vehicle.
  2. Taking steps toward legal recourse against the involved driver.
  3. Exploring compensation from their insurance company to address losses resulting from the other driver’s negligence.
However, legislative uncertainty still clouds the landscape concerning autonomous vehicles and traffic incidents.

Analysis of Patent Applications filed under Lidar in Autonomous Vehicles
Over the past few years, there has been rapid growth in patent applications covering the use of LiDAR in autonomous vehicles. To date, roughly 81,697 such patents have been recorded around the globe. Ford Global Tech LLC, with ~3,426 patents, is the dominant player in the market, while LG Electronics and Waymo LLC hold the second and third positions in the chart.


[Source: https://www.lens.org/lens/search/patent/list?q=LiDAR%20%20%2B%20Autonomous%20vehicle]
The following charts show the legal status of these patent documents and filing activity over time.


[Source: https://www.lens.org/lens/search/patent/list?q=LiDAR%20%20%2B%20Autonomous%20vehicle]

Through an examination of patent filings across different geographic regions, it is evident that the United States, constituting approximately 78% of the overall patents submitted, holds the foremost position in this chart.


[Source: https://www.lens.org/lens/search/patent/list?q=LiDAR%20%20%2B%20Autonomous%20vehicle]

Conclusion

In conclusion, LiDAR technology in self-driving vehicles has huge scope for improving road safety. With cutting-edge FMCW and flash technologies, the application of LiDAR in autonomous vehicles shows great improvements in accuracy and comfort, providing features like reliable object detection and navigation. Automobile companies such as Toyota have already put the technology into practice in their vehicles, and firms with such large turnovers are seeking to utilize its full potential. LiDAR is poised to play a central role in the future of automotive technology.

Categories
Computer Science

DDR5’s Secret Weapon: On-Die Termination (ODT) for Noise Reduction and Power Efficiency

Enhancing data reliability and performance: Exploring On-die termination (ODT) in DDR5 memory

Signal integrity becomes more important as data is delivered at ever-faster speeds in DDR5 memory. When there is a mismatch between the characteristic impedance of the transmission line and the impedance of the connected devices, signal reflections can occur. DDR5 (Double Data Rate 5) memory modules and other high-speed digital systems use on-die termination (ODT) to lessen signal reflections and enhance signal integrity.

By placing a termination resistor that matches the transmission line’s impedance right on the memory chip, on-die termination minimizes the possibility of signal reflections. Therefore, ODT is a crucial component for high-speed DDR5 memory systems since it aids in enhancing signal quality, decreasing signal ringing, and eventually allowing for higher data transfer speeds with less signal deterioration.  
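The effect of matching can be seen from the standard transmission-line reflection coefficient, Γ = (Z_load − Z_line) / (Z_load + Z_line): a termination equal to the line impedance reflects nothing. A small illustrative sketch (the impedance values below are arbitrary examples, not DDR5 specification values):

```python
def reflection_coefficient(z_load_ohms: float, z_line_ohms: float) -> float:
    """Fraction of an incident wave reflected where a line meets a termination."""
    return (z_load_ohms - z_line_ohms) / (z_load_ohms + z_line_ohms)

print(reflection_coefficient(40.0, 40.0))   # matched termination: 0.0, no reflection
print(reflection_coefficient(120.0, 40.0))  # mismatched: 0.5, half the wave reflects
```

This is why placing a well-matched termination resistor on the die itself, as close to the receiver as possible, suppresses reflections so effectively.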

Figure: ODT termination connecting to other receiver (RCV) circuitry on the DQ, DQS, DM, and TDQS signals

[Source: DDR5 Standard [JEDEC JESD79-5B_v1.20] Page 346 of 502]

P.S. You can refer to DDR5 Standard [JEDEC JESD79-5B_v1.20]: https://www.jedec.org/sites/default/files/docs/JESD79-5B_v1-2.pdf for further studies.     

With on-die termination (ODT), the termination resistor for transmission line impedance matching is housed inside a semiconductor chip as opposed to a printed circuit board (PCB). This termination resistor can be dynamically enabled or disabled depending on the settings of the memory controller and the particular needs of the memory bus.   

Types of On-Die Termination (ODT) in DDR5

There are two primary ODT implementation types in DDR5 memory:

Parallel On-Die Termination (PODT)

The conventional ODT technique used in earlier DDR memory generations is called Parallel On-Die Termination. The data lines on the memory chip are connected in parallel with a fixed termination resistor in PODT. Regardless of whether the ODT is activated or disabled, this resistor offers a constant impedance to the data lines. On a memory module, the termination value is commonly selected to match the characteristic impedance of the transmission lines.

Dynamic On-Die Termination (DODT)

It is a more sophisticated ODT technology that was introduced with DDR5 memory. When using DODT, the termination impedance can be changed dynamically, in contrast to PODT. According to the settings of the memory controller and the precise data transfer requirements at any given time, the termination resistor can be changed or turned on or off. With the aid of this dynamic management, signal integrity can be improved for a range of data rates and load situations.

PODT vs. DODT

Parallel ODT:

  • The termination impedance is fixed and cannot be varied during operation.
  • A mode register set command is required to change between high and low termination impedances.
  • In the conventional board-level variant of this termination method, the resistor is positioned on the motherboard rather than on the die.

Dynamic ODT:

  • The DRAM may switch between high and low termination impedance without requiring a mode register set command.
  • It gives systems more freedom to choose the best termination values under various loading scenarios.
  • It simplifies and lowers the cost of the system design by reducing the amount of complicated wiring and resistor parts on the motherboard.

In conclusion, the primary distinction between parallel ODT and dynamic ODT is that the former has a fixed termination impedance while the latter enables dynamic impedance switching without the requirement of a mode register set instruction. Increased flexibility and optimization for various loading circumstances are provided by dynamic ODT.

Key features of ODT in DDR5

On-Die Termination (ODT), which plays a critical part in guaranteeing dependable and effective high-speed data transmission, is particularly significant in DDR5 memory. ODT addresses several significant issues that arise as data transmission rates climb in contemporary memory systems. The following are the primary implications of ODT in DDR5:

Signal Reflection Reduction

Due to the nature of high-speed digital transmissions, signal reflections and impedance mismatches occur when data signals are carried across the memory bus. These reflections may deteriorate the quality of the delivered data and distort the signal. To lessen signal reflections and minimize data errors, ODT offers termination resistors that are directly attached to the memory chips and match the characteristic impedance of the transmission lines.

Data Reliability

Due to DDR5’s faster data transfer speeds, there is also a greater chance of data errors and corruption. Data distortions and signal ringing are reduced by proper termination utilizing ODT, resulting in more dependable data transfer and a lower probability of memory-related errors. By reducing signal reflections and distortions, ODT allows memory modules to run at their full specified speeds.

Noise reduction

ODT aids in the memory system’s ability to filter out noise and electromagnetic interference (EMI). For signal quality to be maintained and to prevent data corruption or system instability, noise reduction is essential.

Power Efficiency

The Dynamic On-Die Termination (DODT) feature of DDR5 memory enables dynamic management of the termination impedance. DODT optimizes power usage by changing the termination parameters in accordance with the demands of the data transfer. The amount of unnecessary power dissipation is reduced, making the memory system more power-efficient.

Flexibility  

DODT provides more flexibility in memory operations because it is a dynamic implementation of ODT. Memory controllers offer superior adaptability to changing circumstances by adjusting termination settings for various memory configurations, data rates, and system loads.

Intellectual property trends for ODT

ODT in DDR5 is witnessing rapid growth in patent filings across the globe. Over the past few years, the number of patent applications has almost doubled every two years.

Micron is the dominant player in the market with ~426 patents, roughly twice as many as Intel. AMD is the third-largest patent holder in the domain.

Other key players that have filed patents for DDR5 technology with ODT include SK Hynix, NVIDIA, Samsung, IBM, and Qualcomm.


[Source: https://www.lens.org/lens/search/patent/list?q=on-die%20termination%20on%20DDR5%20memory]

Following are the trends of publication and their legal status over time:

[Source: https://www.lens.org/lens/search/patent/list?q=on-die%20termination%20on%20DDR5%20memory]

These top 10 companies own around 54% of the total patents related to ODT in DDR5. The diagram below shows that these companies have built strong IP moats in the US jurisdiction, followed by China, Europe, Korea, and Germany.

[Source: https://www.lens.org/lens/search/patent/list?q=on-die%20termination%20on%20DDR5%20memory]

Conclusion

ODT is becoming more and more important as memory technologies develop. Strong signal integrity and effective data transmission become more crucial with each new memory generation and higher data rates. The use of ODT in DDR5 helps memory systems be prepared for future increases in performance and data transfer speeds. In conclusion, ODT helps to provide a stable and dependable memory system that can support the needs of contemporary computer applications by reducing signal reflections and noise.

Categories
Electronics

Wi-Fi Offloading: Boosting Connectivity, Saving Costs, and Easing Network Congestion

In an increasingly connected world, where our dependency on mobile devices and data use is rising, the demand for fast and dependable internet access is at an all-time high. But mobile networks frequently fail to keep up with this increased demand, resulting in slower speeds, crowded networks, and disgruntled consumers.

To overcome this issue, Wi-Fi offloading has emerged as a possible alternative. In this blog, we will look at the notion of Wi-Fi offloading, its benefits, and how it works.

Understanding Wi-Fi Offloading:

Wi-Fi offloading is the practice of using Wi-Fi hotspots to keep mobile devices connected. This can happen manually or automatically, such as when a user logs into a home or public Wi-Fi network and the device moves from a cellular connection to Wi-Fi or small-cell connectivity, offloading mobile traffic to public hotspots.

WiFi offloading, also called mobile data offloading, diverts cellular network traffic to WiFi networks, improving connectivity and reducing strain on mobile networks.

Benefits of WiFi Offloading:

WiFi offloading offers several advantages.

  • It enhances connectivity by leveraging faster and more reliable WiFi networks, especially in areas with weak cellular signals.
  • It leads to cost savings by reducing mobile data consumption, as WiFi usage doesn’t count towards cellular data caps.
  • It reduces network congestion, improving overall network performance during peak usage.
  • It can extend battery life on mobile devices, as transmitting data over WiFi is more energy-efficient.

How WiFi Offloading Works:

Mobile devices use network selection algorithms to determine the best connection when both cellular and WiFi options are available. Seamless handover ensures uninterrupted connectivity, as devices automatically switch from cellular to WiFi when a connection is available. Authentication protocols and security measures protect data while connected to WiFi networks.

If we speak in technical terms, WiFi offloading refers to a type of handover between a non-WiFi network and a WiFi network.


Figure 1. Mobile data offloading

Source: https://www.researchgate.net/figure/Description-of-Mobile-Data-Offloading_fig2_326030064

Let us look at Figure 1, which explains the offloading procedure. Assume that at time t, a mobile node (MN) seeks to initiate a data transfer session. While the cellular network is always presumed to be available, the WiFi network is only accessible when the MN is close enough to the WiFi coverage. The offloading technique employs a network selection algorithm based on Received Signal Strength (RSS).
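The selection logic described above can be sketched as a simple threshold rule. This is a hypothetical illustration: the function name, threshold value, and return labels are assumptions, not part of any specific offloading standard.

```python
# Assumed minimum usable Wi-Fi signal, in dBm; real systems tune this
# value and add hysteresis to avoid rapid back-and-forth switching.
WIFI_RSS_THRESHOLD_DBM = -75.0

def select_network(wifi_rss_dbm, cellular_available=True):
    """Prefer Wi-Fi when its RSS is usable; otherwise fall back to cellular.
    wifi_rss_dbm is None when the node is outside Wi-Fi coverage."""
    if wifi_rss_dbm is not None and wifi_rss_dbm >= WIFI_RSS_THRESHOLD_DBM:
        return "wifi"
    return "cellular" if cellular_available else "none"

print(select_network(-60.0))  # strong Wi-Fi: offload to "wifi"
print(select_network(-90.0))  # weak Wi-Fi: stay on "cellular"
print(select_network(None))   # out of Wi-Fi coverage: "cellular"
```

A production implementation would also weigh throughput, load, and handover cost, but RSS thresholding is the core of the selection algorithm the figure depicts.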

Received Signal Strength: The Received Signal Strength (RSS) informs the receiver about the strength of the received signal, which represents the power of the signal at the receiving end.

Figure: Received Signal Strength (RSS) between a BLE transmitter and receiver

Source: https://pcng.medium.com/received-signal-strength-rss-8a306b12d520

Smartphone operating systems like Android offer convenient access to the Received Signal Strength (RSS) value when the smartphone receives a Bluetooth Low Energy (BLE) packet. Using the android.bluetooth APIs, we can retrieve this value through the RSSI variable.

The RSS values can provide valuable insights about the BLE transmitter. One practical application is estimating the distance between our smartphone and the BLE transmitter. We can collect the RSS values at various distances and employ curve-fitting methods to create a ranging model. Alternatively, a simple machine learning approach, such as linear regression, can be applied to learn the ranging model.
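As a sketch of that ranging idea, the snippet below fits the common log-distance path-loss model, RSS = RSS(1 m) − 10·n·log10(d), using plain linear regression on calibration data, then inverts it to estimate distance. All sample values are synthetic and for illustration only.

```python
import math

# Synthetic calibration samples: (distance in meters, measured RSS in dBm).
samples = [(1.0, -50.0), (2.0, -56.2), (4.0, -62.1), (8.0, -68.0)]

# Linear regression of RSS against log10(distance): the slope is -10*n
# and the intercept is the RSS at 1 m.
xs = [math.log10(d) for d, _ in samples]
ys = [rss for _, rss in samples]
mean_x = sum(xs) / len(xs)
mean_y = sum(ys) / len(ys)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
rss_at_1m = mean_y - slope * mean_x
path_loss_n = -slope / 10.0

def estimate_distance(rss_dbm: float) -> float:
    """Invert the fitted path-loss model to range from a measured RSS."""
    return 10 ** ((rss_at_1m - rss_dbm) / (10.0 * path_loss_n))
```

With these synthetic samples the fitted exponent comes out close to 2, the free-space value; indoor environments typically yield higher exponents and noisier fits, which is why averaging many RSS readings per distance matters in practice.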

Conclusion:

WiFi offloading optimizes connectivity by diverting data traffic to WiFi networks. It offers benefits such as enhanced connectivity, cost savings, reduced network congestion, and improved battery life. As data demands increase, WiFi offloading proves valuable in providing seamless connectivity and addressing network limitations. WiFi offloading works by using network selection algorithms to determine the best connection and ensure seamless handover between cellular and WiFi networks. The Received Signal Strength (RSS) plays a crucial role in this process, providing information about the strength of the received signal.