Working with LiDAR companies in California alongside my colleague Majid Ebnali Heidari, Ph.D., it's mind-blowing to witness the pace of innovation in the ADAS industry. With so many key players in the field, how do these organizations sustain that pace and stay competitive? How can they accelerate product development cycles without sacrificing validation? How can they get to market quickly without compromising safety standards? As we stand on the cusp of a technological revolution in supercomputing and AI, the answer is becoming much clearer: simulation. Here's how Ansys' comprehensive simulation solutions support these cutting-edge developments:

📈 LiDAR Hardware Design:
- Photonic Design: Achieve superior performance with optimized photonic components.
- Optical Design Optimization & Tolerancing: Ensure accuracy and reliability in your optical systems.
- Optomechanical Packaging: Integrate optical and mechanical components seamlessly.
- Stray Light Analysis: Minimize unwanted light and improve sensor accuracy.
- Structural & Thermal Analysis: Validate and enhance the durability and performance of your designs.

⚙️ Simulated System Performance:
- Time-of-Flight & System Efficiency: Optimize LiDAR system efficiency and accuracy (a minimal time-of-flight sketch follows after this post).
- Environment Integration & System Impact: Simulate real-world conditions to validate system robustness.

🚗 Sensor-to-Vehicle Integration:
- LiDAR-Vehicle Placement Optimization: Ensure optimal sensor placement for maximum coverage and performance.
- Advanced Scenario Validation: Test and validate your LiDAR systems in complex driving scenarios through virtual prototype driving simulation.

💡 Compliance and Beyond:
- Achieve ISO 26262 and ISO/SAE 21434 compliance far more efficiently.
- Consolidate your simulation tech stack with Ansys, covering all your engineering needs under one roof.

Through simulation, LiDAR companies can reduce development time, enhance product reliability, and bring innovative solutions to market faster. With our NVIDIA partnership, Ansys simulations are only becoming faster and more powerful. If you're interested in a deeper dive, shoot me a DM and we can share how we can support your LiDAR projects.

#LiDAR #Simulation #ADAS #AutonomousVehicles #EngineeringExcellence #Ansys #Optics #Photonics
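As a concrete illustration of the time-of-flight relationship mentioned in the list above, here is a minimal sketch in plain Python. It is not an Ansys workflow; the pulse timing values are assumptions chosen only to show how round-trip time maps to range and how range bounds the unambiguous pulse rate.

```python
# Minimal time-of-flight sketch (illustrative only; example values are assumptions,
# not Ansys results). Range follows from the round-trip time of a laser pulse.
C = 299_792_458.0  # speed of light in m/s

def tof_range(round_trip_time_s: float) -> float:
    """Range in meters from pulse round-trip time: R = c * t / 2."""
    return C * round_trip_time_s / 2.0

def round_trip_time(range_m: float) -> float:
    """Round-trip time in seconds for a target at the given range."""
    return 2.0 * range_m / C

if __name__ == "__main__":
    # A return detected 1.334 microseconds after emission corresponds to ~200 m.
    print(f"{tof_range(1.334e-6):.1f} m")
    # A 300 m target requires waiting ~2 microseconds before the next pulse
    # to avoid range ambiguity, which caps the per-channel pulse rate.
    t = round_trip_time(300.0)
    print(f"{t*1e6:.2f} us  ->  max unambiguous pulse rate ~ {1/t:,.0f} Hz")
```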
-
📢 𝙉𝙚𝙬 𝙋𝙖𝙥𝙚𝙧 𝙥𝙪𝙗𝙡𝙞𝙨𝙝𝙚𝙙 📢

🚀 Excited to announce the publication of our latest paper, "LiMOX - A Point Cloud Lidar Model Toolbox Based on NVIDIA OptiX Ray Tracing Engine"!

📝 We are thrilled to share that our team at Virtual Vehicle Research GmbH, in collaboration with JOANNEUM RESEARCH DIGITAL – Digital Twin Lab and Infineon Technologies, has developed 𝐋𝐢𝐌𝐎𝐗, 𝐚 𝐠𝐫𝐨𝐮𝐧𝐝𝐛𝐫𝐞𝐚𝐤𝐢𝐧𝐠 𝐩𝐨𝐢𝐧𝐭 𝐜𝐥𝐨𝐮𝐝 𝐥𝐢𝐝𝐚𝐫 𝐦𝐨𝐝𝐞𝐥 𝐭𝐨𝐨𝐥𝐛𝐨𝐱.

🔎 LiMOX uses ray tracing on the NVIDIA OptiX engine to generate precise point clouds according to material classes. Range handling is driven by reflectivity, based on infrared material measurements, which strongly shapes the generated point clouds. The simulation can be used stand-alone or in modular co-simulation applications via the Open Simulation Interface (OSI).

📚 Published in MDPI, this paper represents a 𝐬𝐢𝐠𝐧𝐢𝐟𝐢𝐜𝐚𝐧𝐭 𝐦𝐢𝐥𝐞𝐬𝐭𝐨𝐧𝐞 in advancing lidar simulation capabilities, paving the way for more accurate and realistic virtual environments.

👏 Huge thanks to our dedicated authors Relindis Rott, David Ritter, Oliver Nikolic, Stefan Ladstätter, and Marcus Hennecke for their invaluable contributions!

❗ Read the full paper here or use the QR code in the picture: https://lnkd.in/d4agv3aM

#LiMOX #lidar #simulation #raytracing #NVIDIAOptiX #MDPI #research #automatedvehicle #AD #publication
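The reflectivity-driven range handling mentioned above can be illustrated with a simple, hedged sketch. This is not LiMOX code; it is a generic single-return model under a simplified lidar range equation, and the material reflectivities and detector threshold are assumptions chosen purely for illustration.

```python
import numpy as np

# Illustrative sketch (not LiMOX): whether a ray produces a point depends on the
# reflectivity of the hit material class and the distance to the hit point.
# Received power is approximated with a simplified lidar equation ~ rho / R^2.

MATERIAL_REFLECTIVITY = {   # assumed Lambertian reflectivities per material class
    "asphalt": 0.10,
    "vegetation": 0.30,
    "painted_metal": 0.60,
    "retroreflector": 0.95,
}

def returns_point(material: str, distance_m: float,
                  p_emit: float = 1.0, sensitivity: float = 2.0e-5) -> bool:
    """Keep the hit only if the modelled return power exceeds the detector threshold."""
    rho = MATERIAL_REFLECTIVITY[material]
    p_received = p_emit * rho / max(distance_m, 1e-6) ** 2
    return p_received >= sensitivity

if __name__ == "__main__":
    for mat in MATERIAL_REFLECTIVITY:
        # Largest distance at which this material class still yields a point.
        max_r = np.sqrt(MATERIAL_REFLECTIVITY[mat] / 2.0e-5)
        print(f"{mat:15s} drops out beyond ~{max_r:5.0f} m "
              f"(80 m hit kept: {returns_point(mat, 80.0)})")
```

The point of the sketch is the qualitative behavior: low-reflectivity materials drop out of the point cloud at much shorter ranges, which is why infrared material measurements matter for realistic simulation.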
-
LiDAR, Medical Imaging, Visual Inspection, Deep Learning, Generative AI, and More! Product Application Engineer at The MathWorks
Thrilled to share my poster "3D Vehicle Detection using Flash Lidar Imagery" from the NVIDIA GTC conference! 🚀 The presentation was a great success, sparking engaging discussions at the forefront of lidar technology and deep learning. The poster highlights the use of MATLAB and the Lidar Toolbox to design a 3D vehicle detection system: preprocessing and labeling the data, training and testing a YOLOX detector, and augmenting it with tracking for robustness. The detection and tracking results are then converted into 3D, showing what lidar can contribute to understanding the world around us. The final step: deploying the system on an NVIDIA Jetson Orin Nano board, demonstrating that advanced AI models can run on edge devices in real time. 🖥️ #NVIDIAGTC #Lidar #DeepLearning #MATLAB #YOLOX #NVIDIAJetson #EdgeComputing #3DVehicleDetection #LidarToolbox Link to the publication: https://lnkd.in/e-vqKtTv
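To make the "converted into 3D" step more concrete, here is a hedged sketch in plain Python/NumPy. It is not the MATLAB Lidar Toolbox workflow from the poster; it simply shows one common way a 2D bounding box on a flash-lidar range image can be lifted to a 3D position, assuming per-pixel azimuth and elevation angles are available from the sensor model (an assumption, along with the image size and ranges below).

```python
import numpy as np

# Illustrative sketch (not the MATLAB/Lidar Toolbox implementation): lift a 2D box
# detected on a flash-lidar range image to a 3D centroid, assuming the sensor
# provides a range value and known azimuth/elevation angles for every pixel.

def box_to_3d(range_img, azimuth, elevation, box):
    """box = (row0, col0, row1, col1) on the range image; returns (x, y, z) in meters."""
    r0, c0, r1, c1 = box
    r = range_img[r0:r1, c0:c1]
    az = azimuth[r0:r1, c0:c1]
    el = elevation[r0:r1, c0:c1]
    valid = r > 0.0                       # zero range = no return
    r, az, el = r[valid], az[valid], el[valid]
    # Spherical to Cartesian per pixel, then average over the box.
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.array([x.mean(), y.mean(), z.mean()])

if __name__ == "__main__":
    h, w = 64, 512                        # assumed flash-lidar image size
    az = np.tile(np.linspace(-0.5, 0.5, w), (h, 1))            # radians
    el = np.tile(np.linspace(-0.2, 0.2, h)[:, None], (1, w))   # radians
    rng = np.full((h, w), 25.0)           # flat 25 m scene for the example
    print(box_to_3d(rng, az, el, (20, 200, 40, 260)))
```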
-
🌟 Friday = an exciting new weekend read! 🌟 This week, we're spotlighting the innovative work of Scantinel Photonics, a global leader in LiDAR sensor technology. Scantinel Photonics has reinforced its position with the introduction of its next-generation Photonic Single Chip, based on standard CMOS technology. This new PIC features a fully integrated, massively parallel detector system for coherent LiDAR. Recently tested at Scantinel, the chip demonstrated a significant improvement in signal-to-noise ratio, about 20 dB better than previous solid-state LiDAR scanners. Designed for automotive LiDAR applications, the scanner-detector chip is a fully integrated, automotive-ready device that includes a photonic chip and a low-noise electronics board. Thanks to the enhanced SNR, the system achieves a tenfold reduction in LiDAR power consumption and enables faster pixel rates. Unlike systems on the market that rely on proprietary technology or two-mirror scanners, this generation takes full advantage of FMCW technology over existing ToF LiDAR systems. PIC production has been fully transferred to high-volume standard CMOS fabrication, showcasing the maturity of Scantinel's technology. For more details, click here: https://bit.ly/45YSfB7 #ScantinelPhotonics #LiDAR #PhotonicChips #Innovation #TechNews #AutomotiveTech
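As background on the FMCW principle referenced above, here is a minimal, hedged sketch of the generic textbook relations in Python. The chirp parameters and wavelength are assumptions for illustration, not Scantinel specifications; the sketch only shows how a coherent FMCW lidar maps beat frequency and Doppler shift to range and radial velocity.

```python
# Generic FMCW relations (illustrative; parameters below are assumptions,
# not Scantinel device specifications).
C = 299_792_458.0          # speed of light, m/s
WAVELENGTH = 1550e-9       # m, typical coherent-lidar wavelength (assumed)
BANDWIDTH = 1.0e9          # Hz, chirp excursion (assumed)
CHIRP_TIME = 10e-6         # s, chirp duration (assumed)

def range_from_beat(f_beat_hz: float) -> float:
    """R = c * f_beat * T / (2 * B) for a linear chirp."""
    return C * f_beat_hz * CHIRP_TIME / (2.0 * BANDWIDTH)

def velocity_from_doppler(f_doppler_hz: float) -> float:
    """Radial velocity from the optical Doppler shift: v = f_d * lambda / 2."""
    return f_doppler_hz * WAVELENGTH / 2.0

if __name__ == "__main__":
    # A 13.3 MHz beat corresponds to roughly 20 m of range with these parameters.
    print(f"{range_from_beat(13.3e6):.1f} m")
    # A 12.9 MHz Doppler shift corresponds to about 10 m/s of radial velocity.
    print(f"{velocity_from_doppler(12.9e6):.1f} m/s")
```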
-
🌟 Sneak Peek Friday: Research Focus Episode! 📚🔬

In this week's episode of Sneak Peek Fridays, we are excited to highlight a significant milestone in our research efforts. Our co-founder, Dr. Philipp Rosenberger, has contributed to a newly published journal paper titled "Introducing the Double Validation Metric for Radar Sensor Models". This publication is a testament to our ongoing commitment to advancing sensor model validation techniques.

The publication is a collaborative effort with our esteemed research partners from the Institute of Automotive Engineering, Technische Universität Darmstadt (FZD), and we are proud to announce that it is available as Open Access, so everyone can benefit from the insights and findings presented in this work.

The paper tackles our core topic at Persival: credible perception sensor simulation. While simulation models for perception sensors such as lidar, radar, and cameras exist at varying levels of detail, validating their accuracy remains an ongoing research challenge. The authors, led by Lukas Elster, explain the Double Validation Metric (DVM), previously applied to lidar sensor models, and extend it to radar sensor models through the DVM Map introduced in the paper. The new method, demonstrated on real and simulated radar sensor data, provides detailed and accurate validation, reveals previously undetected simulation errors, and offers a more intuitive visualization of results using satellite imagery.

The paper is published in Automotive and Engine Technology, a fully open access journal covering all aspects of automotive and commercial vehicle engineering and engine development. The journal provides in-depth articles by expert authors from academia and industry and serves as an essential source of information for a global audience of automotive engineers.

For more information and to read the full paper, visit the following link: https://lnkd.in/eQSTDNqW

Join us in exploring the future of simulation quality and enhancing our understanding of perception sensor simulation. Stay tuned for the latest advancements every Friday!

#SneakPeekFridays #ResearchFocus #PersivalGmbH #TUDarmstadt #Collaboration #Innovation #AutomotiveEngineering #Radar #Perception #Sensor #Simulation #Model #Validation
-
🔬 𝗦𝗶𝗺𝘂𝗹𝗮𝘁𝗶𝗼𝗻 𝗼𝗳 𝘁𝗵𝗲 𝗪𝗲𝗲𝗸 🔬

📡 Silicon photonics for LiDAR is an exciting area of technology that leverages silicon-based optical components to improve the performance and efficiency of LiDAR systems, and it is a key technology shaping the development of next-generation self-driving cars. Significant research in this area has focused on long waveguide grating antennas (WGAs). These antennas are created by introducing periodic perturbations in an integrated waveguide, and they have become a favored option for achieving large apertures because they can extend lengthwise to several millimeters.

This week, we highlight an example of WGA modeling. Using a dual-layer design, a unidirectional emission pattern is achieved via destructive interference of the downward emission and constructive interference of the upward emission. The far-field emission pattern can be obtained at low computational cost by modeling only a small area around the antenna and performing a near-field to far-field projection.

See the original publication from Prof. Michael Watts's group here: https://lnkd.in/gUKimnTD
See the full Python notebook implementation of the model here: https://lnkd.in/gcUKgREE

#LiDAR #SiliconPhotonics #IntegratedPhotonics #Tidy3D #FDTD #Flexcompute
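For intuition about how such a waveguide grating antenna steers its emission, here is a hedged sketch of the generic first-order grating relation in Python. The effective index and grating pitch below are assumed example values, not numbers taken from the cited paper or the notebook.

```python
import numpy as np

# Illustrative first-order grating-antenna relation (assumed values, not from the
# cited publication): the emission angle follows from phase matching between the
# guided mode and the radiated wave, sin(theta) = n_eff - lambda / pitch.

N_EFF = 2.4        # effective index of the guided mode (assumed)
PITCH = 640e-9     # grating period in meters (assumed)

def emission_angle_deg(wavelength_m: float) -> float:
    s = N_EFF - wavelength_m / PITCH
    if abs(s) > 1.0:
        raise ValueError("no radiating first-order solution at this wavelength")
    return np.degrees(np.arcsin(s))

if __name__ == "__main__":
    # Sweeping the laser wavelength steers the beam along the antenna axis,
    # which is how many OPA/WGA lidar designs scan without moving parts.
    for wl_nm in (1500, 1550, 1600):
        print(f"{wl_nm} nm -> {emission_angle_deg(wl_nm * 1e-9):6.2f} deg")
```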
-
This is not a photo--this is the Ansys Perceive EM solver operating in NVIDIA Omniverse. Perceive EM is a real-time EM solver, running on NVIDIA GPUs, that generates radar and wireless channel models (I/Q complex data--all channels!) and provides radar/comms channel data on demand through an efficient, lightweight solver-on-demand API. The color plot shows range-Doppler data for a 77 GHz radar on the car (antenna models by Ansys HFSS!). The vertical axis is range; the horizontal axis is velocity (Doppler). Watch the micro-Doppler returns from the pedestrian's arms and legs partway through the radar demo--it's stunning. And the simulation runs as fast as (or faster than) a real radar would operate. This technology changes the game for wireless, RF, and radar propagation modeling, thanks to NVIDIA GPU technology and Ansys physics simulation software. A minimal sketch of the underlying range-Doppler processing follows after the shared demo post below. #ansys #nvidia #5G #radar #6G #omniverse
We're collaborating with NVIDIA to bring real-time photorealistic rendering and accurate EM/RF simulation into digital twins. The Ansys Perceive EM Solver is specifically tailored for RF and EM domains, and through its seamless API integration with NVIDIA Omniverse, users can simultaneously evaluate multiple domains, including radar, signal bandwidth, camera systems, and lidar, all within the same simulator. This ability to collect immediate feedback on performance and functionality will help engineers become even more efficient on projects such as optimizing layouts for telecom systems, designing radar sensors, and much more. Want to learn more? Visit us this week at booth 830 at #GTC24 for a live demo set in downtown San Jose.
Ansys Perceive EM Demo: Simulation with APIs in Omniverse
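As referenced above, here is a hedged sketch of conventional range-Doppler processing for a radar I/Q data cube, written in plain NumPy with assumed parameters. It illustrates the standard two-FFT approach used to produce a range-Doppler map; it is not the Perceive EM solver or its API.

```python
import numpy as np

# Generic range-Doppler map from a radar data cube (illustrative only; this is the
# textbook 2-D FFT approach with assumed parameters, not Ansys Perceive EM code).
# Cube layout assumed: (num_chirps, num_samples_per_chirp) of complex I/Q data.

def range_doppler_map(iq_cube: np.ndarray) -> np.ndarray:
    n_chirps, n_samples = iq_cube.shape
    # Window in both dimensions to suppress sidelobes, then FFT over fast time
    # (range) and slow time (Doppler/velocity).
    win = np.hanning(n_samples)[None, :] * np.hanning(n_chirps)[:, None]
    spectrum = np.fft.fft(iq_cube * win, axis=1)                       # range FFT
    spectrum = np.fft.fftshift(np.fft.fft(spectrum, axis=0), axes=0)   # Doppler FFT
    return 20.0 * np.log10(np.abs(spectrum) + 1e-12)                   # dB magnitude

if __name__ == "__main__":
    # Synthetic single target: beat frequency sets the range bin, phase slope across
    # chirps sets the Doppler bin (micro-Doppler would add sidebands around it).
    n_chirps, n_samples = 128, 256
    t = np.arange(n_samples)
    c = np.arange(n_chirps)[:, None]
    cube = np.exp(1j * 2 * np.pi * (0.2 * t[None, :] + 0.05 * c))
    rd = range_doppler_map(cube)
    print("peak at (doppler_bin, range_bin) =",
          np.unravel_index(np.argmax(rd), rd.shape))
```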
-
I am delighted to show you an example of Exwayz SLAM CLI usage! 🚀 exwayz_slam is the first executable of our #3d mapping software suite 👨💻 Its goal is to process raw LiDAR data captured in native #PCAP format or #ROS bag format: it computes the LiDAR #odometry, which is the relative trajectory of the sensor from its first position in the dataset. In the video below, this trajectory is displayed in white. Thanks to this estimated trajectory, a point cloud #map is simultaneously built (in the viridis colormap below) by aggregating the LiDAR frames at the corresponding trajectory points. Basically, a good SLAM produces a locally sharp point cloud, as is the case with ours, of course 😃 The video below is not accelerated: on my computer, a laptop with an Intel Core i7-12700U, exwayz_slam processes data at ~120 Hz whereas the acquisition frame rate was 20 Hz, meaning the processing runs 6 times faster than the data collection ⏩ Useful when you have several hours of data to process! The other Exwayz executables from Exwayz #3DM let you compensate for drift with our loop closure solution and perform fine georeferencing. I'll dedicate a post to them one day, stay tuned 🔥
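To illustrate the "aggregating the LiDAR frames at the corresponding trajectory points" step, here is a hedged NumPy sketch, not Exwayz code: it builds a map by transforming each scan with its estimated odometry pose and concatenating the results, with all poses and point data made up for the example.

```python
import numpy as np

# Illustrative map aggregation (not Exwayz code): given an odometry pose per frame
# (4x4 transform from sensor frame to the first frame), move every scan into the
# common frame and stack the points. Good odometry keeps the result locally sharp.

def aggregate_map(frames, poses):
    """frames: list of (N_i, 3) point arrays; poses: list of (4, 4) transforms."""
    world_points = []
    for pts, T in zip(frames, poses):
        homo = np.hstack([pts, np.ones((len(pts), 1))])   # (N, 4) homogeneous points
        world_points.append((homo @ T.T)[:, :3])          # apply pose, drop w
    return np.vstack(world_points)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two fake scans of the same flat patch, taken 1 m apart along x.
    frame = rng.uniform(-5, 5, size=(1000, 3)) * np.array([1, 1, 0.02])
    pose0 = np.eye(4)
    pose1 = np.eye(4); pose1[0, 3] = 1.0                  # 1 m forward
    cloud = aggregate_map([frame, frame], [pose0, pose1])
    print(cloud.shape)                                     # (2000, 3)
```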
-
Exciting News: Our Latest Research on MIMO Arrays Published! We are thrilled to announce the publication of our latest research article, "Multi-Objective Design and Optimization of Hardware-Friendly Grid-Based Sparse MIMO Arrays," in the journal Sensors. This groundbreaking study dives into the design and optimization of hardware-efficient grid-based sparse MIMO arrays, offering innovative solutions for multi-target detection in radar systems. Our research proposes a novel framework that minimizes hardware complexity while maximizing performance, making it a game-changer for practical applications in fields like automotive radar and advanced communication systems. By leveraging cutting-edge techniques such as desirability functions and machine learning-based optimization, we have opened new avenues for the development of more efficient and adaptable radar systems. In future work, we plan to extend these advancements to other innovative solutions in sensor engineering, including in-cabin child presence detection through heart rate measurements using radar technology. Additionally, we aim to explore applications in other sensors such as Lidar, Time-of-Flight (ToF) sensors, RGB cameras, and IMUs. This will further broaden the applicability of our framework, enabling optimized sensor fusion for more comprehensive and robust detection systems. We invite you to read the full article and explore how this work can contribute to advancements in radar and sensor technologies. Thank you, and stay tuned for more updates as we continue our work in this exciting field of sensor engineering! https://lnkd.in/g-Zf9UeJ
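As a primer on why sparse MIMO array geometry matters for the work above, here is a hedged sketch of the generic MIMO virtual-array construction in Python. The element positions are assumed example values, not the optimized grid layouts or the desirability-function framework from the article.

```python
import numpy as np

# Generic MIMO virtual-array construction (illustrative; element positions are
# assumptions, not the optimized layouts from the paper). With M transmit and
# N receive elements, the virtual array has M*N elements at all pairwise sums of
# the Tx and Rx positions, which is what yields a large aperture from few channels.

def virtual_array(tx_positions, rx_positions):
    tx = np.asarray(tx_positions).reshape(-1, 1)
    rx = np.asarray(rx_positions).reshape(1, -1)
    return np.sort((tx + rx).ravel())

if __name__ == "__main__":
    # A classic sparse example in units of half-wavelength spacing:
    # 3 Tx spread widely, 4 Rx packed densely -> 12 virtual elements, filled aperture.
    tx = [0, 4, 8]
    rx = [0, 1, 2, 3]
    v = virtual_array(tx, rx)
    print(v)                  # [ 0  1  2 ... 11]
    print("virtual elements:", v.size, "from", len(tx) + len(rx), "physical channels")
```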
-
Join our live demo tomorrow, 6 March, 11:00 CET! Learn how we used Synopsys OptoCompiler's advanced Python-based design automation capabilities to meet the optical phase-difference requirements of a LiDAR phased array while managing output power across the emitter array. Register: https://ow.ly/jHQB50QJ02N
Synopsys Photonics IC Development Platform Live Demo (EU)
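For context on the phase-difference requirement mentioned in the demo description, here is a hedged sketch of the generic optical phased-array steering relation in Python. The pitch and wavelength are assumed example values and are unrelated to the OptoCompiler flow shown in the webinar.

```python
import numpy as np

# Generic optical phased-array steering relation (illustrative; pitch and wavelength
# are assumptions, not values from the Synopsys demo). To steer the beam to angle
# theta, adjacent emitters need a phase difference d_phi = 2*pi*d*sin(theta)/lambda.

WAVELENGTH = 1550e-9   # m (assumed)
PITCH = 2.0e-6         # emitter spacing in meters (assumed)

def phase_step_deg(steer_angle_deg: float) -> float:
    theta = np.radians(steer_angle_deg)
    return np.degrees(2 * np.pi * PITCH * np.sin(theta) / WAVELENGTH)

if __name__ == "__main__":
    # Per-emitter phase increments required for a few steering angles; keeping this
    # increment uniform across the array while balancing output power is the kind of
    # constraint a design-automation flow has to manage.
    for angle in (0.0, 5.0, 10.0):
        print(f"{angle:5.1f} deg -> phase step {phase_step_deg(angle):7.1f} deg per emitter")
```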