Hello JdeRobot Mentors,
My name is Ujjwal, and I'm currently finishing my undergraduate coursework.
To get started with the project and demonstrate my understanding of the JdeRobot HAL/GUI setup, I completed the introductory Follow Line exercise, running the RADI Docker environment on my Ubuntu machine.
On the technical side, I used OpenCV to convert the live camera stream to HSV space and combined two masks to isolate the red line robustly across lighting variations. From the resulting binary mask I extracted the line's centroid with cv2.moments, recomputed on every frame.
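The centroid step above can be sketched without the simulator. This is a minimal version, assuming NumPy only, that mirrors what cv2.moments gives you on a binary mask (two cv2.inRange bands are combined in the real pipeline because red hue wraps around 0°/180° in OpenCV's HSV); `line_centroid` is a hypothetical helper name, not from the exercise template:

```python
import numpy as np

def line_centroid(mask: np.ndarray):
    """Centroid of a binary mask via image moments.
    Equivalent to m10/m00, m01/m00 from cv2.moments(mask).
    Returns (cx, cy), or None when the mask is empty (line lost)."""
    ys, xs = np.nonzero(mask)
    m00 = xs.size                       # zeroth moment: number of line pixels
    if m00 == 0:
        return None
    cx = float(xs.sum() / m00)          # m10 / m00
    cy = float(ys.sum() / m00)          # m01 / m00
    return cx, cy

# toy frame: a vertical "line" three pixels wide, centred on column 5
mask = np.zeros((4, 10), dtype=np.uint8)
mask[:, 4:7] = 1
print(line_centroid(mask))              # (5.0, 1.5)
```

The x-coordinate `cx` is the only part the controller needs; `cy` is kept for completeness.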
From the centroid I computed the deviation from the image centre (the cross-track error) and tuned a PD controller that turns it into angular-velocity commands via HAL.setW, which kept the robot tracking the line steadily without oscillation.
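The PD step can be summarised as a small stateful controller. This is a sketch, not my tuned implementation: the gains, the error sign convention, and the class name are illustrative, and the returned value is what would be passed to HAL.setW in the exercise loop:

```python
class PDController:
    """Minimal PD controller: output = Kp*e + Kd*de/dt (per-frame difference)."""
    def __init__(self, kp: float, kd: float):
        self.kp = kp
        self.kd = kd
        self.prev_error = 0.0

    def step(self, error: float) -> float:
        derivative = error - self.prev_error   # finite-difference D term
        self.prev_error = error
        return self.kp * error + self.kd * derivative

# illustrative gains and error: line centroid 80 px right of image centre
pd = PDController(kp=0.005, kd=0.02)
error = 320 - 400                              # image_centre_x - centroid_x
w = pd.step(error)                             # would be fed to HAL.setW(w)
```

The derivative term is what damps the side-to-side oscillation a pure P controller shows on the curves.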
Finally, I added a simple safety behaviour: HAL.setV(0.0) halts forward motion whenever the line drops out of view, even briefly, so the robot never drives blind off the track.
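That safety net reduces to one branch in the per-frame control step. In this sketch, `control_step`, the 0.6 m/s cruise speed, and the proportional steering callable are all hypothetical; the returned pair maps onto the real HAL.setV / HAL.setW calls:

```python
def control_step(centroid, image_centre_x, steer):
    """One control iteration.
    centroid: (cx, cy) from the mask, or None when the line is lost.
    steer: callable mapping cross-track error -> angular velocity.
    Returns (v, w) to pass to HAL.setV / HAL.setW."""
    if centroid is None:
        return 0.0, 0.0                 # line lost: stop, keep heading
    error = image_centre_x - centroid[0]
    return 0.6, steer(error)            # illustrative cruise speed

# line out of view -> robot pauses until it reappears
v, w = control_step(None, 320, lambda e: 0.005 * e)
print(v, w)                             # 0.0 0.0
```

Because the loop runs every frame, motion resumes automatically on the first frame where the mask is non-empty again.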
Screencast.from.03-06-2026.124747.AM.webm