Students are provided with a rigorous background in human "sensors" (including information on the main five senses, sensor anatomies, and nervous system processes) and their engineering equivalents, setting the stage for three associated activities involving sound sensors on LEGO® robots. As they learn how robots receive input from sensors, transmit signals and make decisions about how to move, students reinforce their understanding of the human body's sensory process.
Four lessons related to robots and people present students with life sciences concepts related to the human body (including the brain, nervous system and muscles), introduced through engineering devices and subjects (including computers, actuators, electricity and sensors), via hands-on LEGO® robot activities. Students learn what a robot is and how it works, and then the similarities and differences between humans and robots. For instance, in lesson 3 and its activity, the human parts involved in moving and walking are compared with the corresponding robot components so students see various engineering concepts at work in the functioning of the human body. This helps them to see the human body as a system, that is, from the perspective of an engineer. Students learn how movement results from 1) decision making, such as deciding to walk and move, and 2) implementation by conveying decisions to muscles (human) or motors (robot).
This activity helps students understand the significance of programming and also how the LEGO MINDSTORMS(TM) NXT robot's sensors assist its movement and make programming easier. Students compare human senses to robot sensors, describing similarities and differences.
Students' understanding of how robotic touch sensors work is reinforced through a hands-on design challenge involving LEGO MINDSTORMS(TM) NXT intelligent bricks, motors and touch sensors. They learn programming skills and logic design in parallel as they program robot computers to play sounds and rotate a wheel when a touch sensor is pressed, and then produce different responses if a different touch sensor is activated. Students see first-hand how robots can take input from sensors and use it to make decisions to move as programmed, including simultaneously moving a motor and playing music. A PowerPoint® presentation and pre/post quizzes are provided.
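The two-sensor behavior described above can be sketched as simple branching logic. This Python sketch only simulates the decisions; the sound and motor action names are illustrative stand-ins for the actual NXT programming blocks, not real API calls.

```python
# Hypothetical sketch of the two-touch-sensor decision logic described
# above. The action strings are invented placeholders for the sound and
# motor blocks students would wire up in the NXT programming environment.

def respond_to_touch(sensor_1_pressed: bool, sensor_2_pressed: bool) -> list[str]:
    """Return the list of actions the robot performs for a given sensor state."""
    actions = []
    if sensor_1_pressed:
        # First touch sensor: play a sound and rotate a wheel simultaneously.
        actions.append("play_sound:beep")
        actions.append("rotate_motor_B:360_degrees")
    if sensor_2_pressed:
        # Second touch sensor triggers a different response.
        actions.append("play_sound:tune")
        actions.append("rotate_motor_B:-360_degrees")
    return actions
```

The key idea students practice is that each sensor input maps to its own programmed response, and responses can run at the same time.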
This lesson describes the function and components of the human nervous system. It helps students understand the purpose of our brain, spinal cord, nerves and the five senses. How the nervous system is affected during spaceflight is also discussed in this lesson.
Students learn about the human body's system components, specifically its sensory systems, nervous system and brain, while comparing them to robot system components, such as sensors and computers. The unit's life sciences-to-engineering comparison is accomplished through three lessons and five activities. The important framework of "stimulus-sensor-coordinator-effector-response" is introduced to show how it improves our understanding of the cause-and-effect relationships of both systems. This framework reinforces the theme of the human body as a system from the perspective of an engineer. This unit is the second of a series, intended to follow the Humans Are Like Robots unit.
Vision is the primary sense of many animals and much is known about how vision is processed in the mammalian nervous system. One distinct property of the primary visual cortex is a highly organized pattern of sensitivity to location and orientation of objects in the visual field. But how did we learn this? An important tool is the ability to design experiments to map out the structure and response of a system such as vision. In this activity, students learn about the visual system and then conduct a model experiment to map the visual field response of a Panoptes robot. (In Greek mythology, Argus Panoptes was the "all-seeing" watchman giant with 100 eyes.) A simple activity modification enables a true black box experiment, in which students do not directly observe how the visual system is configured, and must match the input to the output in order to reconstruct the unseen system inside the box.
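The black-box variant can be modeled as probing an unknown mapping: students apply stimuli at known positions and record which output responds, then reconstruct the hidden configuration from the input-output pairs. In this Python sketch, the hidden wiring table is an invented example, not the actual Panoptes robot layout.

```python
# Toy model of the black-box visual-field experiment: the wiring between
# stimulus positions and outputs is hidden inside the "box," and students
# reconstruct it by matching inputs to outputs. The mapping is illustrative.

HIDDEN_WIRING = {"left": "LED_3", "center": "LED_1", "right": "LED_2"}

def probe(stimulus_position: str) -> str:
    """Simulate presenting a stimulus at one position and observing which output fires."""
    return HIDDEN_WIRING[stimulus_position]

def reconstruct(positions: list[str]) -> dict[str, str]:
    """Map each probed stimulus position to its observed response."""
    return {pos: probe(pos) for pos in positions}
```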
Students observe and test their reflexes, including the (involuntary) pupillary response and (voluntary) reaction times using their dominant and non-dominant hands, as a way to further explore how reflexes occur in humans. They gain insights into how our bodies react to stimuli, and how some reactions and body movements are controlled automatically, without conscious thought. Using information from the associated lesson about how robots react to situations, including the stimulus-to-response framework, students see how engineers use human reflexes as examples for controls for robots.
Students learn about human reflexes, how our bodies react to stimuli and how some body reactions and movements are controlled automatically, without thinking consciously about the movement or responses. In the associated activity, students explore how reflexes work in the human body by observing an involuntary human reflex and testing their own reaction times using dominant and non-dominant hands. Once students understand the stimulus-to-response framework components as a way to describe human reflexes and reactions in certain situations, they connect this knowledge to how robots can be programmed to conduct similar reactions.
Students continue to build a rigorous background in human sensors and their engineering equivalents by learning about electronic touch, light, sound and ultrasonic sensors that measure physical quantities somewhat like eyes, ears and skin. Specifically, they learn about microphones as one example of sound sensors, how sounds differ (intensity, pitch) and the components of sound waves (wavelength, period, frequency, amplitude). Using microphones connected to computers running (free) Audacity® software, student teams experiment with machine-generated sounds and their own voices and observe the resulting sound waves on the screen, helping them to understand that sounds are waves. Students take pre/post quizzes, complete a worksheet and watch two short online videos about "seeing" sound.
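The wave quantities students measure in Audacity® are tied together by two standard relations, period T = 1/f and wavelength λ = v/f. A minimal Python sketch, assuming the speed of sound in air at room temperature:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def period(frequency_hz: float) -> float:
    """Period T = 1/f, in seconds."""
    return 1.0 / frequency_hz

def wavelength(frequency_hz: float, speed: float = SPEED_OF_SOUND) -> float:
    """Wavelength lambda = v/f, in meters."""
    return speed / frequency_hz
```

For example, the 440 Hz tone of concert A has a period of about 2.3 ms and a wavelength of about 0.78 m in air.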
Echolocation is the ability to orient by transmitting sound and receiving echoes from objects in the environment. Through a Marco Polo-type activity and a subsequent lesson, students learn the basic concepts of echolocation. They use these concepts to understand how dolphins use echolocation to locate prey, escape predators and navigate their environment, for example by avoiding gillnets set by commercial fishing vessels. Students also learn that dolphin sounds are vibrations created by vocal organs, and that sound is a type of wave that carries both energy and information, which is especially important in the dolphin's case. They learn that a dolphin's sense of hearing is far more acute than human hearing. Students are also introduced to the concept of by-catch, learning what happens to animals caught as by-catch and why.
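At its core, echolocation is a round-trip timing calculation: the sound travels to the target and back, so the distance is half of speed times time. A minimal Python sketch, assuming an approximate speed of sound in seawater:

```python
SPEED_OF_SOUND_IN_SEAWATER = 1500.0  # m/s, an approximate value

def echo_distance(round_trip_time_s: float,
                  speed: float = SPEED_OF_SOUND_IN_SEAWATER) -> float:
    """Distance to the target: the sound travels out and back, so halve the trip."""
    return speed * round_trip_time_s / 2.0
```

For example, an echo returning after 0.1 s corresponds to a target 75 m away.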
Why do humans have two ears? How do the properties of sound help with directional hearing? Students learn about directional hearing and how our brains determine the direction of sounds by the difference in time between arrival of sound waves at our right and left ears. Student pairs use experimental set-ups that include the headset portions of stethoscopes to investigate directional hearing by testing each other's ability to identify the direction from which sounds originate.
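The arrival-time difference the brain uses can be estimated with the simple path-difference model Δt = d·sin θ / v. This Python sketch assumes an approximate ear separation and the speed of sound in air; both values are illustrative.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
EAR_SEPARATION = 0.21    # m, roughly a typical adult head width (assumed value)

def interaural_time_difference(angle_deg: float) -> float:
    """Extra travel time (s) to the far ear for a sound source at angle_deg
    from straight ahead, using the path-difference model dt = d*sin(theta)/v."""
    return EAR_SEPARATION * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND
```

A sound directly to one side (90°) arrives at the far ear only about 0.6 ms later, which is the tiny cue the brain resolves.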
With the challenge to program computers to mimic the human reaction after touching a hot object, students program LEGO® robots to "react" and move back quickly once their touch sensors bump into something. By relating human senses to electronic sensors used in robots, students see the similarities between the human brain and its engineering counterpart, the computer, and come to better understand the functioning of sensors in both applications. They apply an understanding of the human "stimulus-sensor-coordinator-effector-response" framework to logically understand human and robot actions.
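The "stimulus-sensor-coordinator-effector-response" chain can be sketched as a pipeline of small functions, one per stage. This Python sketch is illustrative; the stimulus and action names are invented, not taken from the lesson materials.

```python
# Minimal sketch of the stimulus-sensor-coordinator-effector-response
# framework. Each stage is one function; the strings are placeholders.

def sensor(stimulus: str) -> bool:
    """Sensor stage: detect whether the touch sensor was bumped."""
    return stimulus == "bump"

def coordinator(sensed: bool) -> str:
    """Coordinator stage (brain or computer): decide on a response."""
    return "back_up_quickly" if sensed else "keep_moving"

def effector(decision: str) -> str:
    """Effector stage (muscles or motors): carry out the decision."""
    return f"motors:{decision}"

def respond(stimulus: str) -> str:
    """Full chain: stimulus -> sensor -> coordinator -> effector -> response."""
    return effector(coordinator(sensor(stimulus)))
```

The same chain describes pulling a hand back from a hot stove: the skin senses, the nervous system decides, and the muscles move.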
Paul Andersen takes you on a tour of the cell. He starts by explaining the difference between prokaryotic and eukaryotic cells, and why cells are small but not infinitely small. He also explains how the organelles within the cell work together.
This activity helps students understand how a LEGO MINDSTORMS(TM) NXT robot moves using motors and wheels. Then students relate the concepts of decision-making, actuation and motion in humans to their parallels in mechanized robots, and understand the common themes associated with movement.
Students gain a rigorous background in the primary human "sensors," as preparation for comparing them to some electronic equivalents in the associated activity. A review of human vision, hearing, smell, taste and touch, including the anatomies and operational principles, is delivered through a PowerPoint® presentation. Students learn the concept of "stimulus-sensor-coordinator-effector-response" to describe the human and electronic sensory processes. Student pairs use blindfolds, paper towels and small candies in a taste/smell sensory exercise. They take pre/post quizzes and watch two short online videos. Concepts are further strengthened by conducting the associated activity the following day, during which they learn about electronic touch, light, sound and ultrasonic sensors and then "see" sound waves while using microphones connected to computers running (free) Audacity® software.