Space University Knowledge Center


Featured

Robotics

  • 5 Steps To Create An Artificial Intelligence That Will Manage Your Agricultural Robotics

     

    When people think of agriculture, they think of a farmer leading a horse and plow through a field. However, if you take a closer look at the way this industry is operating now, you won’t believe your eyes.

    View of corn fields and farms in Southern York County, Pennsylvania.

    Artificial intelligence (AI) is gaining a foothold in agriculture. Many companies plan to equip their fleet of agricultural machinery with AI-controlled robotics in the near future. According to an IDTechEx report, 80 percent of companies plan to introduce autonomous machines like tractors, small agricultural robots, mobile dairy farm robots, and drones for harvesting crops.

    Agricultural enterprises are on a constant quest to increase production, and implementing AI-powered agricultural robotics is one of the most promising ways to do it.

    But where to start? What should be done to keep up with this rapidly developing technology?

    Here are five ideas for implementing AI that manages agricultural robotics.

    1. A Clean Bill of Soil Health

    The condition of the soil in which crops will be grown should be tested to determine its health and quality. This analysis of soil conditions allows farm operators to choose organic fertilizers that will improve the soil’s ability to transmit water and air.

    AI-based solutions provide an insightful analysis of soil samples and give you actionable results. Trace Genomics, a California-based company, has developed an AI-based system that performs a DNA analysis of soil samples using specific tools and robotics. You only need to provide a small sample of soil to obtain a full DNA analysis.

    The growing process always begins with a field test to prevent defective crops. The soil DNA analysis includes pathogen screening and a complete set of data on soil health. These tests should be repeated regularly, both before planting and after harvest.

    2. Shaking off Disease

    Planting is one thing, but what about protecting what you’re growing? Threats like deforestation and soil dehydration can impact the quality of crops, leading to various diseases, the consequences of which can cost millions of dollars.

    AI-based solutions can prevent this from happening by screening for and preventing disease outbreaks. For instance, the revolutionary Blue River Technology has created a range of weed-control robots dubbed “See & Spray” that use computer vision and machine learning.

     

     

    See & Spray machine in action. Video credit: YouTube / Blue River Technology.

    The machine detects crops that have been weakened or overrun by weeds and sprays only those spots. The great thing about this technology is that instead of covering the entire field with pesticides, the robot treats only the affected areas.
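    The selective part of this approach comes down to a simple decision rule layered on top of the vision model: spray only the plants the classifier flags as weeds with high confidence. The Python sketch below illustrates that idea only; the data structure, threshold, and classifier scores are assumptions for illustration, not Blue River Technology's actual implementation.

    ```python
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class PlantDetection:
        """One plant found in a camera frame (hypothetical structure)."""
        x: float                 # position along the spray boom, in metres
        weed_probability: float  # classifier confidence that this is a weed, in [0, 1]

    def select_spray_targets(detections: List[PlantDetection],
                             threshold: float = 0.8) -> List[PlantDetection]:
        """Return only the detections confident enough to justify spraying.

        Spraying only high-confidence weeds is what cuts chemical use
        compared with blanket coverage of the whole field.
        """
        return [d for d in detections if d.weed_probability >= threshold]

    if __name__ == "__main__":
        # Stand-in for the output of a real computer-vision model.
        frame = [
            PlantDetection(x=0.12, weed_probability=0.95),  # sprayed
            PlantDetection(x=0.44, weed_probability=0.10),  # healthy crop, skipped
            PlantDetection(x=0.71, weed_probability=0.86),  # sprayed
        ]
        for target in select_spray_targets(frame):
            print(f"Activate nozzle at {target.x:.2f} m")
    ```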

    3. Forewarned, Forearmed

    What if your crops are affected by a disease? AI-powered apps that enable robots to detect weakened crops offer a practical solution.

    Plantix offers an app that aims at improving profitability by performing a health check of the crops to provide full disease control. The biggest advantage of this app is that it boasts a complete library of plant diseases, which helps in detecting the problem and quickly coming up with a solution.

     

    Video credit: YouTube / Plantix.

    Apps like Plantix use machine learning to allow robots to operate according to a specific function, such as detecting weakened plants. When the robot detects a potentially disease-ridden plant, it archives all the relevant information as well as data on how the farmer can rid the crops of the disease and prevent it from happening again.
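    A minimal sketch of that detect-then-archive loop might look like the following. The classifier stub, disease library, and record fields are hypothetical stand-ins rather than Plantix's API; a real system would run a trained image model in place of the placeholder.

    ```python
    import json
    import time
    from pathlib import Path

    # Hypothetical slice of a plant-disease library: diagnosis -> suggested remedy.
    DISEASE_LIBRARY = {
        "leaf_rust": "Apply a targeted fungicide and re-inspect in 7 days.",
        "healthy": None,
    }

    def classify_plant(image_path: str) -> str:
        """Placeholder for an ML classifier; a real system would run a trained
        image model here. Always returns 'leaf_rust' for illustration."""
        return "leaf_rust"

    def inspect_and_archive(image_path: str, log_file: Path) -> None:
        """Classify one plant image and archive the finding plus the remedy,
        mirroring the detect-and-record workflow described above."""
        diagnosis = classify_plant(image_path)
        remedy = DISEASE_LIBRARY.get(diagnosis)
        if remedy is None:
            return  # healthy plant, nothing to record
        record = {
            "timestamp": time.time(),
            "image": image_path,
            "diagnosis": diagnosis,
            "recommended_action": remedy,
        }
        with log_file.open("a") as f:
            f.write(json.dumps(record) + "\n")

    if __name__ == "__main__":
        inspect_and_archive("plot_07/plant_0042.jpg", Path("disease_log.jsonl"))
    ```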

    4. Will It Rain? Will It Shine?

    Weather affects the health and performance of crops. Unfortunately, it’s not always possible to predict the weather with existing standard tools.

    It’s getting harder to predict the weather for agricultural purposes because of the drastic effects of climate change. “Changes in temperature, as well as the growth of atmospheric carbon dioxide, have a significant effect on weather, and on agriculture consequently,” says researcher Martin Heuter.

    AI solutions allow farmers to predict weather and analyze crop sustainability. For instance, aWhere, a system based on a machine-learning algorithm, uses a satellite connection to help you predict the weather. Such AI-powered technology is very effective and accurate.

    5. The Fruits of AI Labor

    Lastly, let’s talk about the AI-based solutions for harvesting. Harvesting is known to be quite labor-intensive. However, with AI-powered robotics, it doesn’t have to be this way.

    Harvest CROO Robotics, a company established in 2013, went to market with a robot that can harvest strawberries and pack them. It can harvest up to 8 acres a day and does the work of approximately 30 humans. This AI-powered solution is a great way to maintain high production and save considerable amounts of money.

    The Bottom Line

    Implementing AI that manages agricultural robotics is a gradual and time-consuming process, but one that will definitely bring many benefits. Hopefully, this guide will help agricultural businesses consider how they can make use of AI in the future.

    Read more »
  • The Cobot Experience: Changliu Liu & The Difference Between Technology and Fantasy

    Carnegie Mellon's Changliu Liu talks cobot safety, the importance of having realistic expectations of what cobots can and can't do and her vision of a manufacturing world after cobots. 

     Changliu Liu

    Changliu Liu is an assistant professor at Carnegie Mellon University, where she leads the Intelligent Control Lab. Her research interests include human-robot interaction (HRI) in manufacturing environments, cobot safety, and enhancing human-robot collaboration (HRC) through the control and planning of robot behavior and the learning and prediction of human behavior. With a special focus on HRI and cobot safety in manufacturing, Liu also envisions, and is helping to build, a future beyond cobots.

    Liu kindly agreed to be interviewed about her research and the importance of having realistic expectations of collaborative robots.

    The Interview

    EC:  Humans and robots interact in all sorts of scenarios, from elder care to surgery and bomb disposal. With so many opportunities for researchers, what attracted you to the study of human-robot interaction in manufacturing?

    CL: From my earliest college studies in mechanical engineering, I've been fascinated by the idea of robots with human-like intelligence that can function as true colleagues. Developing such robots involves understanding how humans and robots interact with each other.

    The best place to study these interactions is in industry and manufacturing, where automation has enjoyed the most popularity and success.

    I'm also very interested to see how technology and society will evolve following the introduction of robots with human-like intelligence.

    EC:  'Human-like intelligence' seems a world away from today's industrial robots and cobots...

    CL: Currently, we are not worried that robots are too smart. We are worried that robots are not smart at all. [Laughs.]

    EC: True.  [Laughs.]  But aren't the limited capabilities of cobots (e.g. around payload and speed and force) what makes them safe to work with?

    CL: In terms of current technology, yes.  But in terms of their future capabilities, no.  Today's cobots are safe.  But to make them safe, they are designed not to be as capable or fast as traditional industrial robots.  Understanding this trade-off between functionality and safety is key to having a realistic understanding of cobots. 

    EC:  What makes for a realistic understanding of cobots?

    CL: One of our goals as a research group is to make sure that people are not overly optimistic about what cobots can do. End users shouldn't view cobots as 'superhumans.'  At the same time, people should not be afraid of cobots in terms of personal safety risk.

    EC: Are end users generally best served by viewing their cobot as a colleague, a tool, a form of prosthesis, or some other category?

    CL:  I think it's best for end-users to see cobots as something between a colleague and a tool. Cobots are definitely more colleague-like than traditional robots, which are typically seen as tools, but cobots are not colleagues to the same level as a human. So, something in-between.

    EC:  What can be done to build realism and trust between humans and cobots?

    CL:  The first step is training and education. End users need to understand how the robot works. Human workers need to have a realistic understanding of the limitations and capabilities of cobots, so that they understand what the cobot can and can't do.

    The second step is to get on and work with the cobot for a while before passing judgment. In this context, cobots can be thought of as tools. You only get used to any tool after playing with it multiple times. If you don't spend time gaining firsthand experience with it, I don't think trust will develop.

    ****

    Here, Changliu Liu is training and testing a safety controller for cobots...  

     

    ****

    EC: Trust and maintaining a sense of personal safety go together. How do you approach cobot safety issues?

    CL: We divide end-user protection into two main stages. The first is 'interactive safety.' This is the stage where we aim to prevent a collision from happening. In this stage, cobots should be smart enough to correctly perceive human motion and to plan and control their own motion accordingly.

    FANUC's CR series of cobots, for example, comes with force and speed limiting. An additional software package allows integrators and end users to specify a soft or virtual fence for the cobot. This invisible, software-defined fence lets you limit the motion of the cobot, so you can instruct it to slow down when a human enters the area.

    If the first stage of end-user safety fails and a collision between human and robot occurs, we enter the second stage, 'intrinsic safety.' Here it is crucial to ensure that only limited force is applied to the human, certainly not enough to create any risk of severe injury.

    So, cobots are safe, but all types of safety need to be defined in certain contexts and the application is key.
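    To make the two stages concrete, here is a minimal Python sketch of how a controller might combine them: speed scaling inside a software-defined fence (interactive safety) and a hard cap on commanded force (intrinsic safety). All thresholds, names, and numbers are illustrative assumptions, not FANUC's or any other vendor's interface.

    ```python
    def speed_scale(distance_to_human_m: float,
                    fence_radius_m: float = 2.0,
                    stop_radius_m: float = 0.5) -> float:
        """Interactive safety: scale commanded speed down as a person approaches
        a virtual fence, stopping entirely inside the stop radius."""
        if distance_to_human_m <= stop_radius_m:
            return 0.0
        if distance_to_human_m >= fence_radius_m:
            return 1.0
        # Linear ramp between the stop radius and the fence radius.
        return (distance_to_human_m - stop_radius_m) / (fence_radius_m - stop_radius_m)

    def limit_contact_force(commanded_force_n: float,
                            max_contact_force_n: float = 140.0) -> float:
        """Intrinsic safety: never command more force than an assumed contact
        limit (140 N here is an illustrative value, not a standard's figure)."""
        return min(commanded_force_n, max_contact_force_n)

    if __name__ == "__main__":
        for d in (3.0, 1.5, 0.4):
            print(f"human at {d} m -> speed factor {speed_scale(d):.2f}")
        print(f"commanded force capped at {limit_contact_force(250.0)} N")
    ```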

    EC: A cobot handling a sharp blade, for example, presents a greater risk than a cobot that's handling small plastic parts?

    CL: Exactly.

    EC: How important are ISO guidelines and standards when it comes to cobot safety?

    CL: They are very important because they specify exactly how much contact speed and force is allowed when there is a collision between a human and a robot in different scenarios.  Standards help cobot makers and provide end users with a sense of security. Cobot manufacturers are doing very well in getting their robots to satisfy ISO standards.

    EC: What advice do you have for end users, especially those that have not experienced a cobot up close before?

    CL: To really get a good understanding of cobots, you have to spend time with them. There are opportunities for enthusiastic end users to gain hands-on experience of cobots outside of manufacturing facilities too: cobot makers and integrators offer participation in human-subject tests, and universities and technical colleges run open houses and demos.

    ****

    In our second video from the Intelligent Control Lab, a cobot yields and then quickly returns to collaborative behavior as a human behaves somewhat randomly in its vicinity... 

     

    ****

    EC: When asked if there was one thing that could send robotics in a completely new direction, Universal Robots co-founder and CTO Esben Østergaard replied “liquid steel.”  You also envision a world far beyond today's cobots, with your work focusing on the software side, on the “human-like intelligence” that will enable enhanced collaboration between manufacturing workers and robots in future.  What's going on with cobot advocates?!

    CL: I think cobots will do really well in manufacturing over the next few decades, but I don't think cobots, like the ones we find in factories today, are the ultimate end-goal for industrial automation. When robots eventually get more intelligent they will be able to handle all the jobs without collaborating with humans. Cobots will prove to be a great solution for the next 20 or 30 years, until more versatile robots can take over.

    EC: 20-30 years is a long time in terms of manufacturing technologies though, isn't it?

    CL: Yes.  And during those years, cobots will learn more from their human colleagues until automation finally takes over.

    EC: How are you and your colleagues helping to design cobots with human-like intelligence?

    CL: We're looking at ways to make cobots autonomous and human-like so they can be like a real colleague to human workers.

    To make this happen we need cobots to have a high level of intelligence; not just the intelligence to slow down and stop when a human gets too close, but the intelligence to even more effectively interact and collaborate with humans on different industrial tasks.

    So, we need a lot of data, especially data about normal human behavior in factories. We work a lot on trying to predict human behavior and safely controlling robot motion based on those predictions, using deep-learning methods.

    To build models of human behaviors using a data-driven approach, we ask human subjects to come to the lab and perform the different types of motions that they would perform on production lines. We record those motions and use deep-learning methods to learn the behavior patterns. We then apply this model in real-time prediction and motion control.
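    A stripped-down sketch of that pipeline is shown below: a predictor extrapolates the worker's next position from recorded motion, and the speed planner reacts to the predicted position rather than the current one. The constant-velocity predictor stands in for the deep-learning model Liu describes, and all names and numbers are illustrative assumptions.

    ```python
    import numpy as np

    def predict_next_position(trajectory: np.ndarray, horizon_s: float = 0.5) -> np.ndarray:
        """Stand-in for the learned predictor: extrapolate the most recent
        displacement over the prediction horizon. A trained model on recorded
        worker motions would replace this constant-velocity assumption."""
        step = trajectory[-1] - trajectory[-2]  # displacement per timestep
        timestep_s = 0.1                        # assumed sampling period
        return trajectory[-1] + step * (horizon_s / timestep_s)

    def plan_speed(predicted_human_pos: np.ndarray,
                   robot_pos: np.ndarray,
                   slow_radius_m: float = 1.5) -> float:
        """Slow the robot pre-emptively if the *predicted* human position,
        rather than the current one, falls inside the slow-down radius."""
        distance = float(np.linalg.norm(predicted_human_pos - robot_pos))
        return 1.0 if distance > slow_radius_m else distance / slow_radius_m

    if __name__ == "__main__":
        # Recorded 2-D positions of a worker walking toward the robot (metres).
        recorded = np.array([[3.0, 0.0], [2.8, 0.0], [2.6, 0.0], [2.4, 0.0]])
        robot = np.array([0.0, 0.0])
        predicted = predict_next_position(recorded)
        print(f"predicted position: {predicted}, speed factor: {plan_speed(predicted, robot):.2f}")
    ```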

    EC: You're making today's cobots sound quite safe and boring by comparison.

    CL: That’s how technology differs from fantasy!

    (Note: The interview was edited for length and clarity. It was conducted for educational purposes and the views expressed therein are those of the expert and do not necessarily reflect those of Robotiq. )

    Read more »
  • What's New In Robotics?  12.07.2019

    Good morning.  In this week's news mix: ABB announces pharma cobots project, researchers unveil prototype cobot safety system and India's robotics growth driven by SMEs, says UR.  We also admire Moxi's social skills, meet a signing robot, marvel at two new space bots, and much more!

    Cobots & manufacturing

     ABB announced this week that its YuMi collaborative robots are set to be installed in the Texas Medical Center in Houston, where they will be used to support the facility's health-care laboratory services. 

    Advanced collaborative robotics for medical laboratories. Credit: ABB

    Via Barron's:

    “Right now, laboratory automation is hard automation—very specific machines doing very specific tasks,” Marc Segura, ABB’s global head of service robotics, said. “What the lab needs now is flexibility—the lab is becoming increasingly complex."

    Researchers have demonstrated a prototype system that not only slows cobots down when the operator comes closer, but can also orient the cobot's elbow away from an operator while still performing its job...


    "Exponential" growth in robotics adoption in India is being driven by small and medium sized business rather than large companies, Pradeep David, general manager (South Asia) of Universal Robots, told News 18 this week:

    "In the past three years, we’ve grown in usage and implementation exponentially. When we first came in, our primary businesses were with the larger companies — for them, they don't really have a choice but use automation in the manufacturing process. But today, I can go as far as saying that industrial automation is growing exponentially in India because SMEs are adopting new tech."

    JH Robotics released new video showcasing its Sidekick CR7, a mobile collaborative machine tending system that incorporates FANUC's CR-7iAL cobot...

     

    Finnish subcontract machine tooling shop Ket-Met has reduced machine hour costs by more than 40% thanks to the introduction of cobots. Via Robotics & Automation News: 

    Credit: Universal Robots

    "The cobot moves the workpieces on the lathe or milling machine, waits for their completion, cleans the fasteners and parts with compressed air and transports the finished parts to the washing basket for washing. In addition to servicing machines, cobots are used to assemble parts and grind burrs."

    New video from Integro Technologies shows a cobot performing 20 distinct point inspections using a mounted camera...

     

    In other cobot and manufacturing news: 

    • The next industrial revolution will come from nature (Science|Business)
    • Ford unveils $1 billion investment in Chicago plants (ThomasNet)
    • Intelligent transport systems enhance Human-Robot Collaboration (Design Systems)
    • How high fives help us get in touch with robots (IEEE Spectrum)
    • Spotlight on... robotics (DC Velocity)

     

    Elsewhere...  

    Moxi, a nurse's assistant robot developed by Austin, Texas-based Diligent Robotics, has surpassed researchers' expectations in the field, FastCompany reported:

    Credit: Diligent Robotics

    "While Moxi’s job is to take as many mundane tasks as possible off nurses’ plates so that they could spend more time interacting with patients, the Diligent team was surprised to find that patients were fascinated by the robot and wanted to interact with it during their beta trials. Patients ended up being so infatuated with Moxi that they would ask for selfies with the robot; one child even sent Diligent Robotics a letter asking where Moxi lived."

    Scientists from the Universidad Carlos III de Madrid (UC3M), Spain, have published a paper about their research into interactions between robots and deaf people, in which they were able to program the humanoid TEO robot to communicate in sign language.

    Credit: UC3M

    Via EurekAlert!:

    "One of the main new developments of this research is that we united two major areas of Robotics: complex systems (such as robotic hands) and social interaction and communication," explains Juan Víctores, one of the researchers from the Robotics Lab in the Department of Systems Engineering and Automation of the UC3M.

    My first thought? Great research. And imagine, some years down the line, a robotic prosthesis controlled by a brain-computer interface that can sign for you while leaving your arms free to work on other tasks. The result? Hands-free sign language.

    In a fascinating interview published in Quanta this week, Hod Lipson from the Creative Machines Lab at Columbia University spoke about self-awareness in robots:

    “We used to refer to consciousness as ‘the C-word’ in robotics and AI circles, because we’re not allowed to touch that topic,” he said. “It’s too fluffy, nobody knows what it means, and we’re serious people so we’re not going to do that. But as far as I’m concerned, it’s almost one of the big unanswered questions, on par with origin of life and origin of the universe."

    • This Fujian farm robot has a green thumb (MenaFN)
    • Are robots getting better at football? (BBC)
    • MIT robot attempts the #BottleCapChallenge (Mashable)
    • The awesome bowling robot is surely fake (Wired)
    • Big-eyed robots in supermarkets are really annoying people (ZDNet)

    Come back next week for more of the latest robotics news! Until then, please enjoy...


    Five vids for Friday 

     1.  Researchers at EPFL have created a swarm of tiny, reconfigurable robots that can communicate with each other, assign roles among themselves and complete complex tasks together.  (H/T Ars Technica.)


    2.  Technical University of Munich (TUM) researchers have successfully demonstrated a completely automated plane landing system that works without the need for ground-based technologies. (TUM has the details.)


    3.  Engineers at NASA's Jet Propulsion Laboratory in Pasadena, California, have released new video of their four-limbed LEMUR ('Limbed Excursion Mechanical Utility Robot') bot.  LEMUR can scale rock walls using tiny fishhooks in each of its 16 fingers.  (H/T NASA JPL)


    4.  Speaking of space bots, a team of students from ETH Zurich and ZHAW Zurich released video of their bouncy SpaceBok robot this week.  The bot, which uses a reaction wheel to control its orientation, was developed to "investigate the potential of ‘dynamic walking’ and jumping to get around in low gravity environments." (H/T European Space Agency.)


    5.  U.S. startup Refraction A.I. unveiled its three-wheeled, autonomous Rev-1 delivery robot this week.   Rev-1 is set for food delivery tests this summer. (H/T Digital Trends)

     

    Read more »

European Space Agency Articles

  • ESA press invitation: “Take me to the Moon” theme evening at ESOC in Darmstadt on 16 July 2019

    “Forward to the Moon” is the motto of the European Space Agency (ESA) in the 50th anniversary year of the first crewed Moon landing. Why is Earth's natural satellite so interesting for science? Which lunar projects, crewed and uncrewed, is ESA currently working on? When will humans stand on the Moon again? On Tuesday, 16 July, ESA invites you to the theme evening “Take me to the Moon” at its satellite control centre (ESOC) in Darmstadt. ESA experts, including the astronauts Thomas Reiter and Matthias Maurer, will look back on the achievements of the Apollo missions and explain the next steps in the exploration of the Moon.

    Read more »
  • Forest fire

    Copernicus Sentinel-2 satellites capture images of the forest fire in Mecklenburg-Vorpommern. Read more »
  • On the way to Mars: skin and bones from the 3D printer

    3D printing of human tissue could help keep astronauts healthy on the way to Mars. An ESA project has produced the first bioprinted skin and bone samples.

    Read more »
