http://fr.wikipedia.org/wiki/Trois_lois_de_la_robotique
Related: Robotique / Thymio, Python and Asebamedulla - Pierre Boudes

#!/usr/bin/python
# -*- coding: utf-8 -*-
import dbus
import dbus.mainloop.glib
import tempfile

thymio = "thymio-II"

# first we need network access to the Thymio through D-Bus
dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
bus = dbus.SessionBus()  # we use the session bus (alternative: system bus)
network = dbus.Interface(bus.get_object('ch.epfl.mobots.Aseba', '/'),
                         dbus_interface='ch.epfl.mobots.AsebaNetwork')

"""Event listening
The Thymio has to run code forwarding the interesting local events.
Let's write our own .aesl source file for that purpose."""
with tempfile.NamedTemporaryFile(suffix='.aesl', delete=False) as aesl:
    aesl.write('<!
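Once the network proxy exists, the same AsebaNetwork interface can also be used to poll the robot's variables directly, without any event plumbing. Below is a minimal sketch, assuming asebamedulla is running on the session bus and the robot shows up under the node name "thymio-II"; GetNodesList and GetVariable are methods exposed by asebamedulla's D-Bus interface, and "prox.horizontal" is the Thymio's array of horizontal proximity readings.

#!/usr/bin/python
# -*- coding: utf-8 -*-
# Sketch: poll a Thymio variable over D-Bus (assumes asebamedulla is already running).
import dbus
import dbus.mainloop.glib

dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
bus = dbus.SessionBus()
network = dbus.Interface(bus.get_object('ch.epfl.mobots.Aseba', '/'),
                         dbus_interface='ch.epfl.mobots.AsebaNetwork')

node = 'thymio-II'
print(network.GetNodesList())                        # should list the node name
prox = network.GetVariable(node, 'prox.horizontal')  # seven front/back proximity values
print([int(v) for v in prox])

Polling like this is enough for simple read loops; the event/.aesl approach in the original script is only needed when the robot itself has to push data (button presses, tap events) to the Python side.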
Robogator: Your Fearless LEGO Mindstorms Guardian - LEGO Reviews & Videos
Robogator is the second build suggested by the LEGO Mindstorms 8547 set. It is a robot that imitates a crocodile, with a moving jaw and four legs it uses to patrol and defend a position. Personally I haven't found this one really fun to build or impressive to watch, but it has been awesome for learning more about NXT-G coding.

Tutorials: Official MINDSTORMS NXT 2.0 Bonus Models
With help from the Mindstorms Community Partners (MCP), LEGO has released several bonus models for the NXT 2.0 set. However, they are not easily found on the LEGO website. This page is a collection of pictures and links to the bonus projects. Robot Square does not host the instructions or programs.

NXT Segway with Rider - Programming and Important Usage Information
Note: unlike balancing robots that use a gyroscopic sensor or other special sensors, this design uses only the color sensor, which does not know which way is "up" in an absolute sense. It can only estimate its relative tilt from the amount of light reflected from the ground. As a result, getting a good balance is a bit of a challenge. Please read the following important tips.
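The reflected-light trick is worth spelling out: the program records the sensor reading while the robot is held upright, then drives the wheels to keep the current reading near that calibration value. The original model is programmed in NXT-G, so the Python sketch below is only an illustration of that control idea; read_reflected_light() and set_motor_power() are hypothetical placeholders for whatever sensor/motor API you have (nxt-python, ev3dev, a simulator), and the gains are assumptions that would need tuning on a real robot.

import time

KP, KD = 3.0, 12.0   # proportional / derivative gains (assumed values, tune by trial)
DT = 0.01            # control loop period in seconds

def balance_loop(read_reflected_light, set_motor_power):
    # Calibration: the reading while the robot is held balanced becomes the target.
    target = read_reflected_light()
    previous_error = 0.0
    while True:
        # Leaning forward or backward changes the sensor's distance to the ground
        # and therefore the amount of reflected light it picks up.
        error = read_reflected_light() - target
        derivative = (error - previous_error) / DT
        power = KP * error + KD * derivative          # simple PD controller
        set_motor_power(max(-100, min(100, power)))   # clamp to the motor power range
        previous_error = error
        time.sleep(DT)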
Get picture from ErgoJr camera with API (e.g. for Snap!) - Support
Hello @TDTron. Today there is no fast and efficient way to receive the video signal, so no entry has been added to the API. However, you can do it yourself! Insert these few lines into the file poppy_src/pypot/server/snap.py. At line 10, insert: import cv2

How to use the Ergo Jr camera in Python? - Support
Hello! I'm new on the forum, please tell me if my topic already exists. I started using the poppy-ergojr this year for a project in class. I didn't meet any difficulties in using the motors for all of the moving functionalities, but I need to set the camera as 'dummy' before using the robot. As I would like to use the camera to recognize a face or read a QR code, how or where can I set the camera? I used 'poppy-configure ergo-jr m1' to configure the motors, but I don't know how to configure the camera. Thanks for reading
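For using the camera from Python directly (rather than through the Snap! HTTP API), the pypot camera sensor can be read in a couple of lines. This is a sketch under the assumption of a standard poppy-ergo-jr install: depending on the version the import may be from poppy.creatures instead of pypot.creatures, and camera.frame is assumed here to return the latest image as a numpy array in OpenCV's BGR layout. Passing camera='dummy' is what disables the camera when it is absent or not needed.

# Sketch: grab one frame from the Ergo Jr camera in Python (assumptions noted above).
import cv2
from pypot.creatures import PoppyErgoJr   # older installs: from poppy.creatures import PoppyErgoJr

jr = PoppyErgoJr()                        # PoppyErgoJr(camera='dummy') skips the camera entirely
frame = jr.camera.frame                   # latest image as a numpy array (BGR)
cv2.imwrite('snapshot.jpg', frame)        # any OpenCV processing starts from this frame
jr.close()

Once the frame is in hand, face or QR-code detection is ordinary OpenCV work (e.g. cv2.CascadeClassifier or cv2.QRCodeDetector in recent OpenCV versions) and is independent of the robot API.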