spectrum.ieee.org.rss.xml - sfeed_tests - sfeed tests and RSS and Atom files
 (HTM) git clone git://git.codemadness.org/sfeed_tests
 (DIR) Log
 (DIR) Files
 (DIR) Refs
 (DIR) README
 (DIR) LICENSE
       ---
       spectrum.ieee.org.rss.xml (340224B)
       ---
            1 <?xml version="1.0" encoding="UTF-8"?>
            2 <?xml-stylesheet href="/assets/static/xsl/rss.xsl" type="text/xsl"?><rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/" version="2.0">
            3   <channel>
            4     <title>IEEE Spectrum Recent Content full text</title>
            5     <link>https://spectrum.ieee.org</link>
            6     <description>IEEE Spectrum Recent Content headlines</description>
            7     <pubDate>Fri, 30 Oct 2020 20:25:00 GMT</pubDate>
            8     <atom:link href="https://spectrum.ieee.org/rss/fulltext" type="application/rss+xml" rel="self" />
            9     <item>
           10       <title>Video Friday: Attack of the Hexapod Robots</title>
           11       <link>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-attack-hexapod-robots</link>
           12       <description>Your weekly selection of awesome robot videos</description>
           13       <category>robotics</category>
           14       <category>robotics/robotics-hardware</category>
           15       <pubDate>Fri, 30 Oct 2020 20:25:00 GMT</pubDate>
           16       <guid>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-attack-hexapod-robots</guid>
           17       <content:encoded><![CDATA[<p></p> 
           18 <p>Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (<a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a>!):</p> 
           19 <h5><a href="http://www.iros2020.org/">IROS 2020</a> –&nbsp;October 25-25, 2020 –&nbsp;[Online]</h5> 
           20 <h5><a href="https://roscon.ros.org/world/2020/">ROS World 2020</a> –&nbsp;November 12, 2020 –&nbsp;[Online]</h5> 
           21 <h5><a href="https://cybathlon.ethz.ch/en/">CYBATHLON 2020</a> –&nbsp;November 13-14, 2020 –&nbsp;[Online]</h5> 
           22 <h5><a href="https://sites.psu.edu/icsr2020/">ICSR 2020</a> –&nbsp;November 14-16, 2020 –&nbsp;Golden, Colo., USA</h5> 
           23 <p><a href="mailto:automaton@ieee.org?subject=Robot%20video%20suggestion%20for%20Video%20Friday">Let us know</a> if you have suggestions for next week, and enjoy today’s videos.</p> 
           24 <hr> 
           25 <!--nextpage--> 
           26 <p>Happy Halloween from HEBI Robotics!</p> 
           27 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/Is8C6imjgp0" width="620"></iframe></p> 
           28 <p><em>Thanks&nbsp;Hardik!</em></p> 
           29 <p>[ <a href="https://www.hebirobotics.com/">HEBI Robotics</a> ]</p> 
           30 <hr> 
           31 <p>Happy Halloween from Berkshire Grey!</p> 
           32 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/dT9j4h4rIm8" width="620"></iframe></p> 
           33 <p>[ <a href="https://www.berkshiregrey.com/">Berkshire Grey</a> ]</p> 
           34 <hr> 
           35 <blockquote> 
           36  <p><em>These are some preliminary results of our lab’s new work on using reinforcement learning to train neural networks to imitate common bipedal gait behaviors, without using any motion capture data or reference trajectories. Our method is described in an upcoming submission to ICRA 2021. Work by Jonah Siekmann and Yesh Godse.</em></p> 
           37 </blockquote> 
           38 <p></p> 
           39 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/Wb0tIWBrjmc" width="620"></iframe></p> 
           40 <p></p> 
           41 <p>[ <a href="https://mime.oregonstate.edu/research/drl/">OSU DRL</a> ]</p> 
           42 <p></p> 
           43 <hr> 
           44 <p></p> 
           45 <blockquote> 
           46  <p><em>The northern goshawk is a fast, powerful raptor that flies effortlessly through forests. This bird was the design inspiration for the next-generation drone developed by scientifics of the Laboratory of Intelligent Systems of EPFL led by Dario Floreano. They carefully studied the shape of the bird’s wings and tail and its flight behavior, and used that information to develop a drone with similar characteristics.</em></p> 
           47 </blockquote> 
           48 <p></p> 
           49 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/h5ELn3hGA0o" width="620"></iframe></p> 
           50 <p></p> 
           51 <blockquote> 
           52  <p><em>The engineers already designed a bird-inspired drone with morphing wing back in 2016. In a step forward, their new model can adjust the shape of its wing and tail thanks to its artificial feathers. Flying this new type of drone isn’t easy, due to the large number of wing and tail configurations possible. To take full advantage of the drone’s flight capabilities, Floreano’s team plans to incorporate artificial intelligence into the drone’s flight system so that it can fly semi-automatically. The team’s research has been published in Science Robotics.</em></p> 
           53 </blockquote> 
           54 <p>[ <a href="https://actu.epfl.ch/news/raptor-inspired-drone-with-morphing-wing-and-tail/">EPFL</a> ]</p> 
           55 <hr> 
           56 <p>Oopsie.</p> 
           57 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/x4fdUx6d4QM" width="620"></iframe></p> 
           58 <p>[ <a href="https://roborace.com/">Roborace</a> ]</p> 
           59 <hr> 
           60 <p></p> 
           61 <p>We’ve covered MIT’s Roboats in the past, but now <a href="/cars-that-think/transportation/marine/mit-unveils-a-roboat-big-enough-to-stand-on">they’re big enough to keep a couple of people afloat</a>.</p> 
           62 <p></p> 
           63 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/OYmVwvP_pD0" width="620"></iframe></p> 
           64 <p></p> 
           65 <blockquote> 
           66  <p><em>Self-driving boats have been able to transport small items for years, but adding human passengers has felt somewhat intangible due to the current size of the vessels. Roboat II is the “half-scale” boat in the growing body of work, and joins the previously developed quarter-scale Roboat, which is 1 meter long. The third installment, which is under construction in Amsterdam and is considered to be “full scale,” is 4 meters long and aims to carry anywhere from four to six passengers.</em></p> 
           67 </blockquote> 
           68 <p>[ <a href="https://news.mit.edu/2020/autonomous-boats-could-be-your-next-ride-1026">MIT</a> ]</p> 
           69 <p></p> 
           70 <hr> 
           71 <p></p> 
           72 <blockquote> 
           73  <p><em>With a training technique commonly used to teach dogs to sit and stay, Johns Hopkins University computer scientists showed a robot how to teach itself several new tricks, including stacking blocks. With the method, the robot, named Spot, was able to learn in days what typically takes a month.</em></p> 
           74 </blockquote> 
           75 <p></p> 
           76 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/dvxqjJBWFD4" width="620"></iframe></p> 
           77 <p></p> 
           78 <p>[ <a href="https://hub.jhu.edu/2020/10/26/positive-reinforcement-for-robots/">JHU</a> ]</p> 
           79 <p></p> 
           80 <hr> 
           81 <p></p> 
           82 <blockquote> 
           83  <p><em>Exyn, a pioneer in autonomous aerial robot systems for complex, GPS-denied industrial environments, today announced the first dog, Kody, to successfully fly a drone at Number 9 Coal Mine, in Lansford, PA. Selected to carry out this mission was the new autonomous aerial robot, the ExynAero.</em></p> 
           84 </blockquote> 
           85 <p></p> 
           86 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/BaL98fhuMmE" width="620"></iframe></p> 
           87 <p></p> 
           88 <p>Yes, this is obviously a publicity stunt, and Kody is only flying the drone in the sense that he’s pushing the launch button and then taking a nap. But that’s also the point— drone autonomy doesn’t get much fuller than this, despite the challenge of the environment.</p> 
           89 <p>[ <a href="https://www.exyn.com/news/kody-the-dog-selected-as-first-animal-drone-operator-after-successful-flight-in-pennsylvania">Exyn</a> ]</p> 
           90 <p></p> 
           91 <hr> 
           92 <p></p> 
           93 <blockquote> 
           94  <p><em>In this video object instance segmentation and shape completion are combined with classical regrasp planning to perform pick-place of novel objects. It is demonstrated with a UR5, <a href="https://robots.ieee.org/robots/robotiq/?utm_source=spectrum">Robotiq</a> 85 parallel-jaw gripper, and Structure depth sensor with three rearrangement tasks: bin packing (minimize the height of the packing), placing bottles onto coasters, and arrange blocks from tallest to shortest (according to the longest edge). The system also accounts for uncertainty in the segmentation/completion by avoiding grasping or placing on parts of the object where perceptual uncertainty is predicted to be high.</em></p> 
           95 </blockquote> 
           96 <p></p> 
           97 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/OBGf7L3iKsM" width="620"></iframe></p> 
           98 <p></p> 
           99 <p>[ <a href="https://www.ccs.neu.edu/home/mgualti/2021-Gualtieri-PickPlaceWithUncertainObjectInstanceSegmentationAndShapeCompletion.pdf">Paper</a> ] via [ <a href="https://www.khoury.northeastern.edu/research_areas/robotics/">Northeastern</a> ]</p> 
          100 <p><em>Thanks Marcus!</em></p> 
          101 <p></p> 
          102 <hr> 
          103 <p></p> 
          104 <p>U can’t touch this!</p> 
          105 <p></p> 
          106 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/KMmkcUkTC5w" width="620"></iframe></p> 
          107 <p></p> 
          108 <p>[ <a href="http://ishikawa-vision.org/">University of Tokyo</a> ]</p> 
          109 <p></p> 
          110 <hr> 
          111 <p></p> 
          112 <blockquote> 
          113  <p><em>We introduce a way to enable more natural interaction between humans and robots through Mixed Reality, by using a shared coordinate system. Azure Spatial Anchors, which already supports colocalizing multiple HoloLens and smartphone devices in the same space, has now been extended to support robots equipped with cameras. This allows humans and robots sharing the same space to interact naturally: humans can see the plan and intention of the robot, while the robot can interpret commands given from the person’s perspective. We hope that this can be a building block in the future of humans and robots being collaborators and coworkers. </em></p> 
          114 </blockquote> 
          115 <p></p> 
          116 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/bhoNnqtte_M" width="620"></iframe></p> 
          117 <p></p> 
          118 <p>[ <a href="https://www.microsoft.com/en-us/research/video/enabling-interaction-between-mixed-reality-and-robots-via-cloud-based-localization/">Microsoft</a> ]</p> 
          119 <p></p> 
          120 <hr> 
          121 <p></p> 
          122 <p>Some very high jumps from the skinniest quadruped ever.</p> 
          123 <p></p> 
          124 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/57LDt7paOEw" width="620"></iframe></p> 
          125 <p></p> 
          126 <p>[ <a href="https://open-dynamic-robot-initiative.github.io/">ODRI</a> ]</p> 
          127 <p></p> 
          128 <hr> 
          129 <p></p> 
          130 <blockquote> 
          131  <p><em>In this video we present recent efforts to make our humanoid robot LOLA ready for multi-contact locomotion, i.e. additional hand-environment support for extra stabilization during walking.</em></p> 
          132 </blockquote> 
          133 <p></p> 
          134 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/mpDqMFppT68" width="620"></iframe></p> 
          135 <p></p> 
          136 <p>[ <a href="https://www.mw.tum.de/en/am/research/current-projects/robotics/humanoid-robot-lola/">TUM</a> ]</p> 
          137 <p></p> 
          138 <hr> 
          139 <p></p> 
          140 <p>Classic bike moves from Dr. Guero.</p> 
          141 <p></p> 
          142 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/j6bNVqe_1xY" width="620"></iframe></p> 
          143 <p></p> 
          144 <p>[ <a href="http://ai2001.ifdef.jp/">Dr. Guero</a> ]</p> 
          145 <p></p> 
          146 <hr> 
          147 <p></p> 
          148 <p>For a robotics company, iRobot is OLD.</p> 
          149 <p></p> 
          150 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/N0o0l_DjbSg" width="620"></iframe></p> 
          151 <p></p> 
          152 <p>[ <a href="https://www.irobot.com/">iRobot</a> ]</p> 
          153 <p></p> 
          154 <hr> 
          155 <p></p> 
          156 <blockquote> 
          157  <p><em>The Canadian Space Agency presents Juno, a preliminary version of a rover that could one day be sent to the Moon or Mars. Juno can navigate autonomously or be operated remotely. The Lunar Exploration Analogue Deployment (LEAD) consisted in replicating scenarios of a lunar sample return mission.</em></p> 
          158 </blockquote> 
          159 <p></p> 
          160 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/TmxafOXh7Xs" width="620"></iframe></p> 
          161 <p></p> 
          162 <p>[ <a href="https://www.asc-csa.gc.ca/eng/rovers/mission-simulations/lunar-exploration-analogue-deployment.asp">CSA</a> ]</p> 
          163 <p></p> 
          164 <hr> 
          165 <p></p> 
          166 <blockquote> 
          167  <p><em>How exactly does the <a href="https://robots.ieee.org/robots/waymo/?utm_source=spectrum">Waymo</a> Driver handle a cat cutting across its driving path? Jonathan N., a Product Manager on our Perception team, breaks it all down for us.</em></p> 
          168 </blockquote> 
          169 <p></p> 
          170 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/vtwFdQjj8N4" width="620"></iframe></p> 
          171 <p></p> 
          172 <p>Now do kangaroos.</p> 
          173 <p>[ <a href="https://waymo.com/">Waymo</a> ]</p> 
          174 <p></p> 
          175 <hr> 
          176 <p></p> 
          177 <p><a href="https://robots.ieee.org/robots/jibo/?utm_source=spectrum">Jibo</a> is hard at work at MIT playing games with kids.</p> 
          178 <p></p> 
          179 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/VyZDZbk1ebU" width="620"></iframe></p> 
          180 <p></p> 
          181 <blockquote> 
          182  <p><em>Children’s creativity plummets as they enter elementary school. Social interactions with peers and playful environments have been shown to foster creativity in children. Digital pedagogical tools often lack the creativity benefits of co-located social interaction with peers. In this work, we leverage a social embodied robot as a playful peer and designed Escape!Bot, a game involving child-robot co-play, where the robot is a social agent that scaffolds for creativity during gameplay.</em></p> 
          183 </blockquote> 
          184 <p>[ <a href="https://doi.org/10.1145/3383668.3419895">Paper</a> ]</p> 
          185 <p></p> 
          186 <hr> 
          187 <p></p> 
          188 <p>It’s nice when convenience stores are convenient even for the folks who <a href="/automaton/robotics/robotics-hardware/video-friday-telexistence-model-t-robot">have to do the restocking</a>.</p> 
          189 <p></p> 
          190 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/WLDucRUwJbo" width="620"></iframe></p> 
          191 <p></p> 
          192 <p>Who’s moving the crates around, though?</p> 
          193 <p>[ <a href="https://tx-inc.com/en/home/">Telexistence</a> ]</p> 
          194 <p></p> 
          195 <hr> 
          196 <p></p> 
          197 <blockquote> 
          198  <p><em>Hi, fans ! Join the ROS World 2020, opening November 12th , and see the footage of ROBOTIS’ ROS platform robots :)</em></p> 
          199 </blockquote> 
          200 <p></p> 
          201 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/j0PaIwk0fbs" width="620"></iframe></p> 
          202 <p></p> 
          203 <p>[ <a href="https://roscon.ros.org/world/2020/">ROS World 2020</a> ]</p> 
          204 <p></p> 
          205 <hr> 
          206 <p></p> 
          207 <blockquote> 
          208  <p><em>ML/RL methods are often viewed as a magical black box, and while that’s not true, learned policies are nonetheless a valuable tool that can work in conjunction with the underlying physics of the robot. In this video, Agility CTO Jonathan Hurst - wearing his professor hat at Oregon State University - presents some recent student work on using learned policies as a control method for highly dynamic legged robots.</em></p> 
          209 </blockquote> 
          210 <p></p> 
          211 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/QsvVM1NKQSQ" width="620"></iframe></p> 
          212 <p></p> 
          213 <p>[ <a href="https://www.agilityrobotics.com/">Agility Robotics</a> ]</p> 
          214 <p></p> 
          215 <hr> 
          216 <p></p> 
          217 <p>Here’s an ICRA Legged Robots workshop talk from Marco Hutter at ETH Zürich, on Autonomy for ANYmal.</p> 
          218 <p></p> 
          219 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/uzfozcY7NaE" width="620"></iframe></p> 
          220 <p></p> 
          221 <blockquote> 
          222  <p><em>Recent advances in legged robots and their locomotion skills has led to systems that are skilled and mature enough for real-world deployment. In particular, quadrupedal robots have reached a level of mobility to navigate complex environments, which enables them to take over inspection or surveillance jobs in place like offshore industrial plants, in underground areas, or on construction sites. In this talk, I will present our research work with the quadruped ANYmal and explain some of the underlying technologies for locomotion control, environment perception, and mission autonomy. I will show how these robots can learn and plan complex maneuvers, how they can navigate through unknown environments, and how they are able to conduct surveillance, inspection, or exploration scenarios.</em></p> 
          223 </blockquote> 
          224 <p>[ <a href="https://rsl.ethz.ch/">RSL</a> ]</p> 
          225 <p></p> 
          226 <hr> 
          227 <p></p>]]></content:encoded>
          228       <dc:creator>Evan Ackerman</dc:creator>
          229       <dc:creator>Erico Guizzo</dc:creator>
          230       <dc:creator>Fan Shi</dc:creator>
          231       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMjAyOQ.jpeg" />
          232       <media:content url="https://spectrum.ieee.org/image/MzcxMjAyOQ.jpeg" />
          233     </item>
          234     <item>
          235       <title>This Startup Spots Stress in Real-Time to Help Prevent Depression and Other Conditions</title>
          236       <link>https://spectrum.ieee.org/the-institute/ieee-member-news/this-startup-spots-stress-in-realtime-to-help-prevent-depression-and-other-conditions</link>
          237       <description>Philia Labs’ wearable measures physiological indicators to help manage stress</description>
          238       <category>the-institute</category>
          239       <category>the-institute/ieee-member-news</category>
          240       <pubDate>Fri, 30 Oct 2020 18:00:00 GMT</pubDate>
          241       <guid>https://spectrum.ieee.org/the-institute/ieee-member-news/this-startup-spots-stress-in-realtime-to-help-prevent-depression-and-other-conditions</guid>
          242       <content:encoded><![CDATA[<style type="text/css">.entry-content .tisubhead {
          243     color: #999999;
          244     font-family: verdana;
          245     font-size: 14px;
          246     font-weight: bold;
          247     letter-spacing: 1px;
          248     margin-bottom: -5px !important;
          249     text-transform: uppercase;
          250 }
          251 .tiopener {
          252     color: #0f4994;
          253     font-family: Theinhardt-Medium, sans-serif;
          254   letter-spacing: 1px;
          255   margin-right: 10px;
          256     font-weight: bold;
          257     text-transform: uppercase;
          258 }
          259 </style> 
          260 <figure class="xlrg" role="img"> 
          261  <img alt="Illustration of a smart watch with emojis coming out of it" src="/image/MzcxMTE4NQ.jpeg"> 
          262  <figcaption class="hi-cap">
          263    Illustration: iStockphoto/IEEE Spectrum 
          264  </figcaption> 
          265 </figure> 
          266 <p><span class="tiopener">THE INSTITUTE </span>By any measurement, 2020 has been stressful for just about everyone because of the COVID-19 pandemic. Fear about the virus and concerns about our health and that of loved ones can be overwhelming. Add that to the other tensions many of us have at work, at home, and at school.</p> 
          267 <p>When a person is stressed enough, the <a href="https://www.livescience.com/65446-sympathetic-nervous-system.html">fight-or-flight response</a> kicks in. The sympathetic nervous system causes a sudden release of hormones—which increases heart rate, blood pressure, and perspiration.</p> 
          268 <p>The first step in controlling stress is to know its symptoms, but because most people are used to some stress, they don’t realize how bad things have gotten until they reach a breaking point. Over time they could experience serious health problems such as heart disease, high blood pressure, and diabetes, as well as depression and other mental health woes, according to the U.S.&nbsp;<a href="https://www.nimh.nih.gov/health/publications/stress/index.shtml">National Institute of Mental Health</a>. More than <a href="https://www.who.int/news-room/fact-sheets/detail/depression">264&nbsp;million people suffer from depression</a>, the <a href="https://www.who.int/">World Health Organization</a> reports.</p> 
          269 <p>What if there was a way to measure in real time when a person was becoming stressed, so the condition could be managed immediately using evidence-based methods? That’s the idea behind <a href="https://philialabs.com.au/">Philia Labs</a>, a startup in Melbourne, Australia, that has developed a platform with a wearable device designed to measure physiological stress indicators.</p> 
          270 <p>The product is aimed at health care providers and mental health professionals, as well as people who want to monitor their own stress level.</p> 
          271 <p>“We are quantifying stress in the body in real time,” says Dilpreet Buxi, the startup’s cofounder and chief executive. “The hardware platform and software will enable interventions both through a health care provider and by the patient to basically enable better health outcomes and a better quality of life.”</p> 
          272 <p></p> 
          273 <h3 class="tisubhead">STRESS INDICATORS</h3> 
          274 <p>To confirm whether someone suffers from stress, Buxi says, doctors typically use a questionnaire such as the <a href="https://www.tac.vic.gov.au/files-to-move/media/upload/k10_english.pdf">Kessler Psychological Distress Scale</a> or the <a href="http://www2.psy.unsw.edu.au/dass/">Depression Anxiety Stress Scales</a>. Such forms help assess a person’s emotional state and quality of life based on situations that might trigger anxiety. But because they are self-evaluations, the results can be inaccurate.</p> 
          275 <p>Some of today’s fitness wearables claim to measure stress. They use data about heart rate, sleep, and level of activity to infer how stress is affecting the wearer. But, Buxi says, the results from such devices haven’t been clinically validated.</p> 
          276 <p>In contrast, he says, Philia aims to measure physiological data that has been shown to more closely align with stress response and to pursue focused clinical testing. Philia’s wearable, which is worn on the wrist for at least six months, uses optical sensors to measure heart rate and blood flow. Electrodes measure “galvanic skin response”—changes in moisture caused by sweat-gland activity that can indicate a person’s emotional state, Buxi says.</p> 
          277 <p>“<em>Galvanic skin response</em> refers to the electrical conductivity of the skin,” he says. “In other words, when you break out into a nervous sweat, the electrical conductivity will change.”</p> 
          278 <figure class="xlrg" role="img"> 
          279  <img alt="Philia Labs graphic showing acute stress" src="/image/MzcxMTE5NQ.jpeg"> 
          280  <figcaption class="hi-cap">
          281    Image: Philia Labs 
          282  </figcaption> 
          283  <figcaption>
          284    Data on acute stress over a 24-hour time period collected with Philia Labs’ platform. 
          285  </figcaption> 
          286 </figure> 
          287 <p>Philia will initially pilot its technology on patients undergoing depression treatment, he says, adding that a clinician will prescribe the device and a clinical monitoring program for the patient. Physiological and self-reported data are captured from the patient’s sympathetic arousal—that fight-or-flight response—and computed. Trends in sympathetic arousal activity over weeks and months are calculated to determine whether a patient requires an intervention such as a change of medication or psychosocial treatment. All the information is stored in the cloud.</p> 
          288 <p>For patients who previously have had depression, early intervention could help reduce the risk of a recurrence, Buxi says.</p> 
          289 <p>“According to our conversations with psychiatrists,” he says, “stress that results in sympathetic arousal is a leading cause of relapse and needs to be monitored in order for the psychiatrist to intervene earlier.”</p> 
          290 <p>He says the likelihood that a person who has recovered from depression will relapse in the first year when suffering from stress is 20&nbsp;percent to 50&nbsp;percent.</p> 
          291 <p>“The platform will enable the medical provider to make better decisions,” he says. For patients, he adds, “the goal is to basically help them adopt better techniques for stress management.”</p> 
          292 <p>Philia has several partners including medical institutions and research universities. It is running pilot programs with 11&nbsp;health care and wellness organizations. The company has filed a provisional patent application.</p> 
          293 <figure class="rt med" role="img"> 
          294  <img alt="Verisense wearable" src="/image/MzcxMTIwNA.jpeg"> 
          295  <figcaption class="hi-cap">
          296    Photo: Verisense 
          297  </figcaption> 
          298  <figcaption>
          299    Philia Labs is using the Verisense IMU sensor to track patients’ symptoms of stress. 
          300  </figcaption> 
          301 </figure> 
          302 <p>The startup has a proof-of-concept prototype for the wearable, which is built using off-the-shelf parts and is moving to a minimum viable product that will be used after a study and several trial programs are completed next year, Buxi says. A lab study on 60 patients is currently happening and will end in April. A small trial on those with mild depression patients starts in January, and a multi-site trial in depression relapse will begin in June. He says the trial is with a corporate health provider, which can expand the company’s market portfolio to non-clinicians.</p> 
          303 <p>The company will be seeking regulatory approval for the platform after it undergoes clinical trials.</p> 
          304 <h3 class="tisubhead">INSPIRATION</h3> 
          305 <p>A biomedical engineer, Buxi worked from 2008 to 2012 at the <a href="https://www.holstcentre.com/">Holst</a> research center in Eindhoven, the Netherlands, where he integrated state-of-the-art technologies for wearable health care devices. After that, he relocated with his family to Australia, where he pursued a Ph.D. at <a href="https://www.monash.edu/">Monash University</a> in Melbourne. For his research-project thesis, he developed a wearable blood pressure monitoring system based on pulse transit time—for which he was granted a <a href="https://sourceip.ipaustralia.gov.au/patent/cuffless-blood-pressure-monitoring-system-au2015903504/AVEFOpozNpW9jtTZMGkt">patent</a> from the Australian government.</p> 
          306 <p><a href="https://ieeexplore.ieee.org/author/37601391600">Several of his research papers</a> are published in the IEEE Xplore Digital Library.</p> 
          307 <p>Buxi got to thinking whether he might apply his Ph.D. work to the problem of measuring stress.</p> 
          308 <figure class="xlrg" role="img"> 
          309  <img alt="Left: Dilpreet Buxi; Right: Alexander Senior" src="/image/MzcxMTE4Nw.jpeg"> 
          310  <figcaption class="hi-cap">
          311    Photos: Philia Labs 
          312  </figcaption> 
          313  <figcaption>
          314    Dilpreet Buxi (left) and Alexander Senior, co-founders of Philia Labs. 
          315  </figcaption> 
          316 </figure> 
          317 <p>He began working on the idea in 2017 as a side project, and in 2018 he formed a proprietary limited partnership with the startup’s cofounder, <a href="https://philialabs.com.au/">Alexander Senior</a>. Today the company has seven employees—a mix of engineers, scientists, and entrepreneurs. The company also has collaborators from industry and academia who have expertise in machine learning, biomedical machine learning, and physiology.</p> 
          318 <p>The business has largely been funded by a venture capitalist and is close to completing its seed funding round.</p> 
          319 <p></p> 
          320 <h3 class="tisubhead">LEARNING TO BE AN ENTREPRENEUR</h3> 
          321 <p>Buxi says his biggest challenge was transitioning from being an engineer and scientist to becoming an entrepreneur.</p> 
          322 <p>“You need to think in terms of what is the problem you’re solving that requires a solution that somebody is going to pay money for,” he says. “That’s completely different from doing an investigation in the lab.” As an entrepreneur, “you have to find a solution where you can repeatedly get new and old customers to pay [so that you have] new and recurring revenue.</p> 
          323 <p>“That took a lot of learning,” he says. “In fact, even today, I think more commercially, but I’m still pretty academic. And sometimes it shows.”</p> 
          324 <p>He says he got help with how to run a startup from IEEE’s <a href="https://entrepreneurship.ieee.org/founder-office-hours-mentors-investors/">Founder Office Hours program</a>, which seeks to assist early- and growth-stage technology entrepreneurs from the IEEE community. It connects entrepreneurs to mentors who can provide feedback and potentially help them grow their company.</p> 
          325 <p>In <a href="https://entrepreneurship.ieee.org/2020_06_12_testimonial-office-hours/">a testimonial</a> about the program, Buxi says he got assistance with validating the product, thinking about the pros and cons of various business models, and refining an intellectual-property strategy to create value.</p> 
          326 <p>“The program shaped our thinking a bit,” he says, “to make our approach more practical.”</p>]]></content:encoded>
          327       <dc:creator>Kathy Pretz</dc:creator>
          328       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMTE1MA.jpeg" />
          329       <media:content url="https://spectrum.ieee.org/image/MzcxMTE1MA.jpeg" />
          330     </item>
          331     <item>
          332       <title>When X-Rays Were All the Rage, a Trip to the Shoe Store Was Dangerously Illuminating</title>
          333       <link>https://spectrum.ieee.org/tech-history/heroic-failures/when-xrays-were-all-the-rage-a-trip-to-the-shoe-store-was-dangerously-illuminating</link>
          334       <description>The shoe-fitting fluoroscope was unnecessary and hazardous, but kids loved it</description>
          335       <category>tech-history</category>
          336       <category>tech-history/heroic-failures</category>
          337       <pubDate>Fri, 30 Oct 2020 15:00:00 GMT</pubDate>
          338       <guid>https://spectrum.ieee.org/tech-history/heroic-failures/when-xrays-were-all-the-rage-a-trip-to-the-shoe-store-was-dangerously-illuminating</guid>
          339       <content:encoded><![CDATA[<figure class="xlrg" role="img"> 
          340  <img alt="Photo of shoe fluoroscope." src="/image/MzcwNzM3OQ.jpeg"> 
          341  <div class="ai"> 
          342   <figcaption class="hi-cap">
          343     Photo: ORAU 
          344   </figcaption> 
          345  </div> 
          346 </figure> 
          347 <p>How do those shoes fit? Too tight in the toes? Too wide in the heel? Step right up to the Foot-O-Scope to eliminate the guesswork and take a scientific approach to proper shoe fitting!</p> 
          348 <p>When the German engineer and physicist <a href="https://ethw.org/Wilhelm_Roentgen">Wilhelm Röntgen</a> accidentally discovered a mysterious light that would pass through most substances and leave behind a ghostly image of an object’s interior, I doubt he had shoes in mind. Indeed, he didn’t even know what the light was, so he called it “X-rays,” the “X” standing for the unknown. That name stuck for English speakers, although in many languages they’re known as Röntgen rays. 8 November marks the 125th anniversary of his discovery.</p> 
          349 <figure class="rt med" role="img"> 
          350  <img alt="Photo of a shoe fluoroscope." src="/image/MzcwNzM5OQ.jpeg"> 
          351  <div class="ai"> 
          352   <figcaption class="hi-cap">
          353     Photo: Oak Ridge Associated Universities 
          354   </figcaption> 
          355   <figcaption>
          356     From the 1920s through the 1950s, thousands of shoe stores in North American and Europe touted their shoe-fitting fluoroscopes, which produced X-rays of customers’ feet. 
          357   </figcaption> 
          358  </div> 
          359 </figure> 
          360 <p>Röntgen published his findings on 28 December 1895, and within a month, “On a New Kind of Rays” had been translated into English and <a href="https://www.nature.com/articles/053274b0">published in <em>Nature</em></a>. Three weeks after that, <a href="https://science.sciencemag.org/content/3/59/227"><em>Science </em>reprinted it</a><em>. </em>Word also spread quickly in the popular press about this wondrous light that allowed you to see inside the human body. Similar to Marie and Pierre Curie, Röntgen refused to take out any patents so that humanity could benefit from this new method for querying nature. Scientists, engineers, and medical doctors dove into X-ray research headlong.</p> 
          361 <p>Experimenters quickly realized that X-rays could produce still images, called radiographs, as well as moving images. The object of interest was placed between an X-ray beam and a fluorescent screen. Röntgen had been experimenting with cathode rays and Crookes tubes when he first saw the glow on a screen coated with barium platinocyanide. It took a few weeks of experimenting to capture clear images on a photographic plate. His first X-ray image was of his wife’s hand, distinctly showing the bones and a ring.</p> 
          362 <p>Viewing a moving image was simpler: You just looked directly at the fluorescent screen. Thomas Edison, an early X-ray enthusiast, coined the term fluoroscopy for this new technique, which was developed simultaneously in February 1896 in Italy and the United States.</p> 
          363 <figure class="xlrg" role="img"> 
          364  <img alt="img" src="/image/MzcwNzM5OA.jpeg"> 
          365  <div class="ai"> 
          366   <figcaption class="hi-cap">
          367     Photo: SPL/Science Source 
          368   </figcaption> 
          369   <figcaption> 
          370    <p>A popular 1896 textbook featured a radiograph of a woman’s foot inside a boot.</p> 
          371   </figcaption> 
          372  </div> 
          373 </figure> 
          374 <p>Less than a year after Röntgen’s discovery, William Morton, a medical doctor, and Edwin W. Hammer, an electrical engineer, rushed to publish <em>The X-Ray; or Photography of the Invisible and Its Value in Surgery</em>, which described the necessary apparatus and techniques to produce radiographs. Among the book’s numerous illustrations was a radiograph of a woman’s foot inside a boot<em>. </em>Morton and Hammer’s textbook became popular among surgeons, doctors, and dentists eager to apply this new technology.</p> 
          375 <h2><strong>From early on, feet in shoes were a popular X-ray subject</strong></h2> 
          376 <p>A push from the military during World War I helped establish the fluoroscope for shoe fitting. In his highly regarded 1914 publication <em>A Textbook of Military Hygiene and Sanitation</em>, for instance,<em> </em>Frank Keefer included radiographs of feet in boots to highlight proper and ill-fitting footwear. But Keefer stopped short of recommending that <em>every</em> soldier’s foot be imaged to check for fit, as <a href="https://jacalynduffin.ca/">Jacalyn Duffin</a> and <a href="https://charleshayter.com/about">Charles R. R. Hayter</a> (both historians and medical doctors) detail in their article <a href="https://www.jstor.org/stable/236916">“Baring the Sole: The Rise and Fall of the Shoe-Fitting Fluoroscope”</a><span class="MsoHyperlink"> (<em>Isis</em>, June 2000).</span></p> 
          377 <p>Jacob J. Lowe, a doctor in Boston, used fluoroscopy to examine the feet of wounded soldiers without removing their boots. When the war ended, Lowe adapted the technology for shoe shops, and he filed for a <a href="https://patents.google.com/patent/US1614988A/en">U.S. patent</a> in 1919, although it wasn’t granted until 1927. He named his device the Foot-O-Scope. Across the Atlantic, inventors in England applied for a British patent in 1924, which was awarded in 1926. Meanwhile, Matthew B. Adrian, inventor of the shoe fitter shown at top, <a href="https://patents.google.com/patent/US1642915A/">filed a patent claim</a> in 1921, and it was granted in 1927.</p> 
          378 <p>Before long, two companies emerged as the leading producers of shoe-fitting fluoroscopes: the Pedoscope Co. in England and X-Ray Shoe Fitter Inc. in the United States. The basic design included a large wooden cabinet with an X-ray tube in its base and a slot where customers would place their shoe-clad feet. When the sales clerk flipped the switch to activate the X-ray stream, the customer could view the image on a fluorescent screen, showing the bones of the feet and the outline of the shoes. The devices usually had three eyepieces so that the clerk, customer, and a third curious onlooker (parent, spouse, sibling) could all view the image simultaneously.</p> 
          379 <p>The machines were heralded as providing a more “scientific” method of fitting shoes. Duffin and Hayter argue, however, that shoe-fitting fluoroscopy was first and foremost an elaborate marketing scheme to sell shoes. If so, it definitely worked. My mother fondly remembers her childhood trips to Wenton’s on Bergen Avenue in Jersey City to buy saddle shoes. Not only did she get to view her feet with the fancy technology, but she was given a shoe horn, balloon, and lollipop. Retailers banked on children begging their parents for new shoes.</p> 
          380 <h2><strong>Radiation risks from shoe-fitting fluoroscopes were largely ignored</strong></h2> 
          381 <p>Although the fluoroscope appeared to bring scientific rigor to the shoe-fitting process, there was nothing medically necessary about it. My mother grudgingly acknowledges that the fluoroscope didn’t help her bunions in the least. Worse, the unregulated radiation exposure put countless customers and clerks at risk for ailments including dermatitis, cataracts, and, with prolonged exposure, cancer.</p> 
          382 <p>The amount of radiation exposure depended on several things, including the person’s proximity to the machine, the amount of protective shielding, and the exposure time. A typical fitting lasted 20 seconds, and of course some customers would have several fittings before settling on just the right pair. The first machines were unregulated. In fact, the roentgen (R) didn’t become the internationally accepted unit of radiation until 1928, and the first systematic survey of the machines wasn’t undertaken until 20 years later. That 1948 study of 43 machines in Detroit showed ranges from 16 to 75 roentgens per minute. In 1946, the American Standards Association had issued a safety code for industrial use of X-rays, limiting exposure to 0.1 R per day.</p> 
          383 <figure class="xlrg" role="img"> 
          384  <img alt="Certificates issued to customers highlighted the shoe-fitting fluoroscope’s scientific approach." src="/image/MzcwNzQwMA.jpeg"> 
          385  <div class="ai"> 
          386   <figcaption class="hi-cap">
          387     Photo: Oak Ridge Associated Universities 
          388   </figcaption> 
          389   <figcaption>
          390     Certificates issued to customers highlighted the shoe-fitting fluoroscope’s scientific approach. 
          391   </figcaption> 
          392  </div> 
          393 </figure> 
          394 <p>But some experts had warned about the dangers of X-rays early on. Edison was one. He was already an established inventor when Röntgen made his discovery, and for several years, Edison’s lab worked nonstop on X-ray experiments. That work came to a halt with the decline and eventual death of Clarence M. Dally.</p> 
          395 <p>Dally, a technician in Edison’s lab, ran numerous tests with the fluoroscope, regularly exposing himself to radiation for hours on end. By 1900 he had developed lesions on his hands. His hair began to fall out, and his face grew wrinkled. In 1902, his left arm had to be amputated, and the following year his right arm. <a href="https://timesmachine.nytimes.com/timesmachine/1904/10/04/101240949.html?pageNumber=16">He died in 1904</a> at the age of 39 from metastatic skin cancer. <em>The New York Times</em> called him “a martyr to science.” <a href="https://www.smithsonianmag.com/history/clarence-dally-the-man-who-gave-thomas-edison-x-ray-vision-123713565/">Edison famously stated</a>, “Don’t talk to me about X-rays. I am afraid of them.”</p> 
          396 <p>Clarence Dally may have been the first American to die of radiation sickness, but by 1908 the American Roentgen Ray Society reported 47 fatalities due to radiation. In 1915 the Roentgen Society of Great Britain issued guidelines to protect workers from overexposure to radiation. These were incorporated into recommendations made in 1921 by the British X-Ray and Radium Protection Committee, a group with a similar mission. Comparable guidelines were established in the United States in 1922.</p> 
          397 <p>For those concerned about radiation exposure, the shoe-fitting fluoroscope seemed a dangerous machine. Christina Jordan was the wife of Alfred Jordan, a pioneer in radiographic disease detection, and in 1925, <a href="https://www.thetimes.co.uk/archive/article/1925-12-29/11/4.html">she wrote a letter</a> to <em>The</em> <em>Times</em> of London decrying the dangerous levels of X-ray radiation to which store clerks were being exposed. Jordan noted that while a scientist who dies of radiation sickness is celebrated as “a martyr to science,” a “‘martyr to commerce’ stands on a different footing.”</p> 
          398 <p>Charles H. Baber, a merchant on Regent Street who claimed to be the first shoe retailer to use X-rays, <a href="https://www.thetimes.co.uk/archive/article/1925-12-30/6/9.html">replied with a letter</a> the next day. Having used the machine since 1921, he wrote, he saw no harm to himself or his employees. <em>The Times</em> also ran <a href="https://www.thetimes.co.uk/archive/article/1925-12-31/6/5.html">a letter from J. Edward Seager</a> of X-Rays Limited (as the Pedoscope’s manufacturer was then called), noting that the machine had been tested and certified by the National Physical Laboratory. This fact, he wrote, “should be conclusive evidence that there is no danger whatever to either assistants or users of the pedoscope.”</p> 
          399 <p>And that, seemingly, was that. The shoe-fitting fluoroscope flourished in the retail landscape with virtually no oversight. By the early 1950s, an estimated 10,000 machines were operating in the United States, 3,000 in the United Kingdom, and 1,000 in Canada.</p> 
          400 <p>After World War II and the dropping of the atomic bombs, though, Americans began to pull back from their love of all things irradiating. The shoe-fitting fluoroscope did not escape notice. As mentioned, the American Standards Association issued guidance on the technology in 1946, and reports published in the <em>Journal of the American Medical Association </em>and the <em>New England Journal of Medicine</em> also raised the alarm. States began passing legislation that the machines could be operated only by licensed physicians, and in 1957, Pennsylvania banned them entirely. But as late as 1970, 17 states still allowed them. Eventually, a few specimens made their way into museum collections; the <a href="https://www.orau.org/ptp/collection/shoefittingfluor/shoe.htm">one at top</a> is from the <a href="https://www.orau.org/ptp/museumdirectory.htm">Health Physics Historical Instrumentation Museum Collection</a> at the Oak Ridge Associated Universities.</p> 
          401 <p>This video by the U.S. Food and Drug Administration nicely captures how regulators finally caught up with the machine:</p> 
          402 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/cfALJUSmzzk" width="620"></iframe></p> 
          403 <p></p> 
          404 <p>The shoe-fitting fluoroscope is a curious technology. It seemed scientific but it wasn’t. Its makers claimed it wasn’t dangerous, but it was. In the end, it proved utterly superfluous—a competent salesperson could fit a shoe just as easily and with less fuss. And yet I understand the allure. I’ve been scanned for insoles to help my overpronated feet. I’ve been videotaped on a treadmill to help me select running shoes. Was that science? Did it help? I can only hope. I’m pretty sure at least that it did no harm.</p> 
          405 <p></p> 
          406 <p><em>An abridged version of this article appears in the November 2020 print issue as “If the X-Ray Fits.”</em></p> 
          407 <p><em>Part of a </em><a href="/tag/Past+Forward"><em>continuing series</em></a> <em>looking at photographs of historical artifacts that embrace the boundless potential of technology.</em></p> 
          408 <h2></h2> 
          409 <h2>About the Author</h2> 
          410 <p><a href="https://www.sc.edu/study/colleges_schools/artsandsciences/history/our_people/directory/marsh_allison.php">Allison Marsh</a> is an associate professor of history at the University of South Carolina and codirector of the university’s Ann Johnson Institute for Science, Technology &amp; Society.</p> 
          411 <p></p> 
          412 <p></p> 
          413 <p></p>]]></content:encoded>
          414       <dc:creator>Allison Marsh</dc:creator>
          415       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNzM3OQ.jpeg" />
          416       <media:content url="https://spectrum.ieee.org/image/MzcwNzM3OQ.jpeg" />
          417     </item>
          418     <item>
          419       <title>Metal Spheres Swarm Together to Create Freeform Modular Robots</title>
          420       <link>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/freebots-spheres-swarm-robots</link>
          421       <description>FreeBOTs use magnets and internal motors to roll around or stick together</description>
          422       <category>robotics</category>
          423       <category>robotics/robotics-hardware</category>
          424       <pubDate>Fri, 30 Oct 2020 02:16:00 GMT</pubDate>
          425       <guid>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/freebots-spheres-swarm-robots</guid>
          426       <content:encoded><![CDATA[<p>Swarms of modular, self-reconfigurable robots&nbsp;have a lot going for them, at least in theory— they’re resilient and easy to scale, since big robots can be made on demand from lots of little robots. One of the trickiest&nbsp;bits about modular robots is figuring out a simple and reliable way of getting them to connect to each other, without having to rely on some kind of dedicated connectivity system.</p> 
          427 <p>This week at the&nbsp;IEEE/RSJ International Conference on Intelligent Robots (IROS), a research team&nbsp;at the Chinese University of Hong Kong, Shenzhen, led by&nbsp;<a href="https://myweb.cuhk.edu.cn/tllam">Tin Lun Lam</a> is presenting a new kind of modular robot that solves this problem by using little robotic vehicles inside of iron spheres that can stick together wherever you need them to.</p> 
          428 <!--nextpage--> 
          429 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/23I2ms6Wti4?rel=0" width="620"></iframe></p> 
          430 <p>Typically, modular robots are relatively complicated and finicky things, because the connections between them have to combine power, communications, and physical support, leading to robot-to-robot interfaces that are relatively complicated. And we usually see modular robots that emphasize reconfigurability as opposed to any kind of inherent single-module capability. Swarm robots, on the other hand, do emphasize single robot capability, although the single robots are intended to be most useful as part of a large swarm.</p> 
          431 <figure class="xlrg" role="img"> 
          432  <img alt="Freebot robot" src="/image/MzcxMTU3Nw.jpeg"> 
          433  <figcaption class="hi-cap">
          434    Photo: Chinese University of Hong Kong-Shenzhen 
          435  </figcaption> 
          436  <figcaption>
          437    The internal mechanism that FreeBOT uses to move includes a motor and a magnet. 
          438  </figcaption> 
          439 </figure> 
          440 <p>FreeBOT is a sort of hybrid between these two robotic concepts. Each FreeBOT module consists of an iron sphere, inside of which is a little vehicle of sorts with two motorized wheels and a permanent magnet. The magnet keeps the vehicle stuck to the inside of the sphere, and when the wheels spin, it causes the shell to roll forward or backward. Driving the wheels independently turns the shell. If this looks familiar, it could be because the popular Sphero robots have the same basic design. A single module can do a fair amount on its own, with good mobility and some neat tricks around ferromagnetic surfaces.&nbsp;</p> 
          441 <figure class="xlrg" role="img"> 
          442  <img alt="Freebot robots connecting and disconnecting" src="/image/MzcxMTU3OA.jpeg"> 
          443  <figcaption class="hi-cap">
          444    Photo: Chinese University of Hong Kong-Shenzhen 
          445  </figcaption> 
          446  <figcaption> 
          447   <p>How two FreeBOTs connect to one another and&nbsp;separate from each other.</p> 
          448  </figcaption> 
          449 </figure> 
          450 <p>Since each robot has a ferromagnetic shell plus an internal permanent magnet, attaching one robot to another robot is relatively simple. Two robots can touch each other without connecting, since the iron shells are not permanent magnets. To make the attachment, the permanent magnet on the bottom of the little internal vehicle has to get close to the point at which the two spheres are touching, and when it does, the permanent magnet excites a magnetic field in the shells of both robots, causing them to stick together. The exact alignment is very forgiving, and the connection can happen absolutely anywhere on each robot, which is far more versatile than just about any other modular robotic system. Disconnecting simply involves moving the internal vehicle away from the connection point, which removes the magnetic field. Combining multiple FreeBOTs is where things get interesting, since it’s possible to create blobs of robots or chains of robots or use a small pile of robots to help one module overcome obstacles. Ferromagnetic surfaces can be leveraged even more by a swarm than by a single module.</p> 
          451 <p>There are some constraints to the current generation of FreeBOTs; most significantly, they’re remote controlled, without much in the way of onboard sensors (or any obvious way of adding them). Recharging the batteries also seems like it might be difficult. The researchers are working on ways of making the swarm of FreeBOTs at least somewhat autonomous, though, and they say that they plan to make a whole bunch more of them “to fully demonstrate the enormous potential of FreeBOT.” Here’s a peek of what a bunch of FreeBOTs can do:</p> 
          452 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/OR0qrzkkYh4?rel=0" width="620"></iframe></p> 
          453 <p>“FreeBOT: A Freeform Modular Self-reconfigurable Robot with Arbitrary Connection Point - Design and Implementation,” by Guanqi Liang, Haobo Luo, Ming Li, Huihuan Qian, and Tin Lun Lam from the <a href="https://www.cuhk.edu.cn/en">Chinese University of Hong Kong, Shenzhen</a>, will be presented at IROS 2020.</p>]]></content:encoded>
          454       <dc:creator>Evan Ackerman</dc:creator>
          455       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMTU4MA.jpeg" />
          456       <media:content url="https://spectrum.ieee.org/image/MzcxMTU4MA.jpeg" />
          457     </item>
          458     <item>
          459       <title>Are Electronic Media Any Good at Getting Out the Vote?</title>
          460       <link>https://spectrum.ieee.org/podcast/at-work/education/are-electronic-media-any-good-at-getting-out-the-vote</link>
          461       <description>An expert on maximizing voter participation says other methods are better and more cost-effective</description>
          462       <category>at-work</category>
          463       <category>at-work/education</category>
          464       <pubDate>Thu, 29 Oct 2020 20:30:00 GMT</pubDate>
          465       <guid>https://spectrum.ieee.org/podcast/at-work/education/are-electronic-media-any-good-at-getting-out-the-vote</guid>
          466       <content:encoded><![CDATA[<style type="text/css">.detail-wrapper .article-detail .media-wrapper .buttons {
          467   font-family: Theinhardt-Regular,sans-serif;
          468   display: none;
          469   clear: both;
          470   padding: 10px 0;
          471   overflow: hidden;
          472 }
          473 .article-detail article iframe, .article-detail iframe {
          474    display: block;
          475    margin: auto;
          476    margin-bottom: 40px;
          477 }
          478 </style> 
          479 <iframe frameborder="no" height="180" scrolling="no" seamless src="https://share.transistor.fm/e/5d736fb3" width="100%"></iframe> 
          480 <p><strong>Steven Cherry </strong>Hi, this is Steven Cherry for Radio Spectrum.</p> 
          481 <p>For some years and still today, there’s been a quiet but profound schism among political strategists. There are those who favor modern methods and modern media—mass mailings, robocalling, television advertising, and, increasingly, social-media advertising. On the other hand are those, including my guest today, who not only still see a value in traditional person-to-person messaging, but see it as, frequently, the better bang for the campaign buck.</p> 
          482 <p>Just last week <em>[this was recorded Oct 5, 2020—Ed.]</em> the attorney general of Michigan— a state that has been a battleground, not just for electoral delegates, but this methodological dispute—announced that two political operatives were charged with felonies in connection with robocalls that made a number of false claims about the risks of voting by mail, in an apparent attempt to discourage residents of Detroit from voting by mail. And last week as well, the Biden campaign announced <a href="https://www.politico.com/news/2020/10/01/biden-flip-flops-on-door-knocking-with-33-days-left-424642">a complete turnaround on the question of door-to-door canvassing</a>, perhaps the gold standard of person-to-person political campaigning. Are they perhaps afraid of Democratic standard-bearers making the same mistake twice?</p> 
          483 <p>In the endless post-mortem of the 2016 Presidential election, an <a href="https://www.politico.com/story/2016/12/michigan-hillary-clinton-trump-232547">article in Politico</a> argued that the Clinton campaign was too data-driven and model-driven, and refused local requests, especially in Michigan, for boots-on-the-ground support. It quoted a longtime political hand in Michigan as describing quote “months of failed attempts to get attention to the collapse she was watching unfold in slow-motion among women and African-American millennials.”</p> 
          484 <p>I confess I saw something of that phenomenon on a recent Saturday. I’m living in Pittsburgh these days, and in the morning, I worked a Pennsylvania-based phone bank for my preferred political party. One of my first calls was to someone in the Philadelphia area, who told me he had already made his absentee ballot request and asked, while he had me on the phone, when his ballot would come. “There used to be someone around here I forget what you call her but someone I could ask stuff of.” That was strike one.</p> 
          485 <p>In another call, to a man in the Erie area, the conversation turned to yard signs. He said he would like to put one out but he had no idea where to get it. Strike two. In the late afternoon, two of us went to a neighborhood near us to put out door-hangers, and if we saw someone face-to-face we would ask if they wanted a yard sign. One fellow said he would. “We were supposed to get one,” he told us. When he saw we had a stack of them in our car, he sheepishly added, “We were supposed to get two in fact, one for a friend.” That was my third indication in one day that there was a lack of political party involvement at the very local level—in three different parts of what could well be the most critical swing state of the 2020 Presidential election.</p> 
          486 <p>When I strung these three moments together over a beer, my partner immediately thought of a book she owned, <a href="https://www.brookings.edu/book/get-out-the-vote-2/"><em>Get Out the Vote</em></a><em>, </em>now in its fourth edition<em>. </em>Its authors, Donald Green and Alan Gerber, argue that political consultants and campaign managers have underappreciated boots-on-the-ground canvassing in person and on the phone, in favor of less personal, more easily-scaled methods—radio and TV advertising, robocalling, mass mailings, and the like.</p> 
          487 <p>Of particular interest, they base their case with real data, based on experimental research. The first edition of their book described a few dozen such experiments; their new edition, they say, summarizes hundreds.</p> 
          488 <p>One of those authors is <a href="https://polisci.columbia.edu/content/donald-p-green">Donald Green</a>, a political scientist at Columbia University focusing on such issues as voting behavior and partisanship, and most importantly, methodologies for studying politics and elections. His teaching career started at Yale University, where he directed its Institution for Social and Policy Studies. He joins us via Skype.</p> 
          489 <p><strong>Steven Cherry </strong>Don, welcome to the podcast.</p> 
          490 <p><strong>Donald Green </strong>Thank you very much for having me.</p> 
          491 <p><strong>Steven Cherry </strong>Modern campaigns can employ an army of advisers, consultants, direct mail specialists, phone bank vendors, and on and on. You say that much of the advice candidates get from these professionals comes from war stories and not evidence. Robocalls seem to be one example of that. The study of a 2006 Texas primary found that 65 000 calls for one candidate increased his vote share by about two votes.</p> 
          492 <p><strong>Donald Green </strong>Yes, the robocalls have an almost perfect record of never working in randomized trials. These are trials in which we randomly assigned some voters to get a robocall and others not and allow the campaign to give it its best shot with the best possible robocall. And then at the end of the election, we look at voter turnout records to see who voted. And in that particular case, the results were rather dismal. But not just in that case. I think that there have been more than 10 such large-scale experiments, and it’s hard to think of an instance in which they’ve performed well.</p> 
          493 <p><strong>Steven Cherry </strong>The two robocallers in Michigan allegedly made 12 000 calls into Detroit, which is majority black—85 000 calls in total to there and similar areas in other cities. According to a report in the Associated Press, calls falsely claimed that voting by mail would result in personal information going into databases that will be used by police to resolve old warrants, credit card companies to collect debts, and federal officials to track mandatory vaccines. It quoted the calls as saying, “Don’t be finessed into giving your private information to The Man. Beware of vote-by-mail.” You’ve studied plenty of affirmative campaigns, that is, attempts to increase voter participation. Do you have any thoughts about this negative robocalling?</p> 
          494 <p><strong>Donald Green </strong>Well, that certainly seems like a clear case of attempted voter suppression—to try to scare people away from voting. I don’t think I’ve ever seen anything like this. I haven’t heard the call. I’d be curious to know something about the voiceover that was used. But let’s suppose that it seemed credible. You know, the question is whether people take it seriously enough or whether they questioned the content, maybe talking to others in ways that undercut its effectiveness. But if robocalls seldom work, it’s probably because people just don’t notice them. Not sure whether this one would potentially work because it would get somebody to notice at any rate. We don’t know how effective it would be. I suspect not terribly effective, but probably effective enough to be concerning.</p> 
          495 <p><strong>Steven Cherry </strong>Yeah, it was noticed enough that complaints about it filtered up to the state attorney general, but that doesn’t give us any quantitative data.</p> 
          496 <p>For decades, campaigns have spent a lot of their money on television advertising. And it can influence strategy. To take just one example, there’s a debate among Democrats about whether their candidate should invest in Texas because there’s so many big media markets. It’s a very expensive state to contest. What does the experimental data tell us about television?</p> 
          497 <p><strong>Donald Green </strong>Experience on television is relatively rare. One that I’m most familiar with is one that actually I helped conduct with my three coauthors back when we were studying the Texans for Rick Perry campaign in 2006. We randomly assigned 18 of the 20 media markets in Texas to receive varying amounts of TV advertising, and various timings at which point it would be rolled out. And we conducted daily tracking polls to see the extent to which public opinion moved as ads rolled out in various media markets. And what we found was there was some effect of Rick Perry’s advertising campaign, but it subsided very quickly. Only a few days passed before it was essentially gone without a trace, which means that one can burn quite a lot of money for a relatively evanescent effect in terms of the campaign. I really don’t think that there’s much evidence that the very, very large amounts of money that are spent on television in the context of a presidential campaign have any lasting effect. And so it’s really an open question as to whether, say, the $300 million dollars that the Clinton campaign spent in 2016 would have been better spent least as well spent on the ground.</p> 
          498 <p><strong>Steven Cherry </strong>In contrast to war stories, you and your colleagues conduct true randomized experiments. Maybe you could say a little bit more about how hard that is to do in the middle of an election.</p> 
          499 <p>Yes, it’s a juggling act for sure. The idea is, if we wanted to study, for example, the effects of direct mail on voter turnout, one would randomly assign large lists of registered voters, some to get the mail, some to be left alone. And then we’d use the fact that voting is a public record in the United States—and a few other countries as well—to gauge voter turnout after the election is over. This is often unsatisfactory for campaigns. They want to know the answer ahead of time. But first, we know no good way of answering the question before people actually cast their ballots. And so this is something that’s been done in increasing numbers since 1998. And now hundreds of those trials have been done on everything ranging from radio, robocalls, TV, direct mail, phone calls, social media, etc, etc.</p> 
          500 <p><strong>Steven Cherry </strong>One thing you would expect campaign professionals to have data on is cost-effectiveness, but apparently they don’t. But you do. You’ve found, for example, that you can generate the same 200 votes with a quarter of a million robocalls, 38 000 mailers, or 2500 door-to-door conversations.</p> 
          501 <p><strong>Donald Green </strong>Yes, we try to not only gauge the effects of the intervention through randomized trials but also try to figure out what that amounts to in terms of dollars per vote. And these kinds of calculations are always going to be context-dependent because some campaigns are able to rely on inexpensive people power, to inspire volunteers in vast numbers. And so in some sense, the costs that we estimate could be greatly overstated for the kinds of boots-on-the-ground canvassing that are typical of presidential elections in battleground states. Nevertheless, I think that it is interesting to note that even with relatively cautious calculations, to the effect that people are getting $16 an hour for canvassing, canvassing still acquits itself rather well in terms of its comparisons to other campaign tactics.</p> 
          502 <p><strong>Steven Cherry </strong>Now that’s just for turnout, not votes for one candidate instead of another; a nonpartisan good-government group might be interested in turnout for its own sake, but a campaign wants a higher turnout of its own voters. How does it make that leap?</p> 
          503 <p><strong>Donald Green </strong>Well, typically what they do is rely on voter files—and augmented voter files, which is, say, voter files that had other information about people appended to them—in order to make an educated guess about which people on the voter file are likely to be supportive of their own campaign. So Biden supporters have been micro-targeted and so have Trump supporters and so on and so forth, based on their history of donating to campaigns or signing petitions or showing up in party primaries. And that makes the job of the campaign much easier because instead of trying to persuade people or win them over from the other side, they’re trying to bring a bigger army to the battlefield by building up enthusiasm and mobilizing their own core supporters. So the ideal for that kind of campaign is a person who is very strongly aligned with the candidate that is sponsoring the campaign but has a low propensity of voting. And so that that kind of person is really perfect for a mobilization campaign.</p> 
          504 <p>So that could also be done demographically. I mean, there are zip codes in Detroit that are 80 percent black.</p> 
          505 <p><strong>Donald Green </strong>Yes, there are lots of ways of doing this based on aggregates. No, you often don’t have to rely on aggregates because you typically have information about each person. But if you were to basically do it, say, precinct by precinct, you could use as proxies for the left—percentage-African-American—or proxies for the right demographics that are associated with Trump voting. So it’s possible to do it, but it’s probably not state of the art.</p> 
          506 <p><strong>Steven Cherry </strong>You mentioned door-to-door canvassing; it increases turnout but—perhaps counterintuitively—apparently, it doesn’t matter much whether it’s a close contest or a likely blowout, and if it doesn’t matter what the canvasser’s message is.</p> 
          507 <p><strong>Donald Green </strong>This is one of the most interesting things, actually about studying canvassing and other kinds of tactics experimentally. It appears that some of the most important communication at the door is nonverbal. You know, you show up at my door, and I wonder what you’re up to—are you trying to sell me something, trying to, you know, make your way in here? I figure, oh, actually you’re just having a pleasant conversation. You’re a person like me. You’re taking your time out to encourage me to vote. Well, that sounds okay. And I think that that message is probably the thing that sticks with people, perhaps more than the details of what you’re trying to say to me about the campaign or the particularities about why I should vote—should I vote because it’s my civic duty or should I vote because I need to stand up in solidarity with my community? Those kinds of nuances don’t seem to matter as much as we might suppose.</p> 
          508 <p><strong>Steven Cherry </strong>So it seems reminiscent of what the sociologists would call a Hawthorne effect.</p> 
          509 <p><strong>Donald Green </strong>Some of it is reminiscent of the Hawthorne effect. The Hawthorne effect is basically, we increase our productivity when we’re being watched. And so there’s some sense in which being monitored, being encouraged by another person makes us feel as though we’ve got to give a bit more effort. So there’s a bit of that. But I think partly what’s going on is voting is a social activity. And just as you’re more likely to go to a party if you were invited by a person as opposed to by e-mail. So too, you’re more likely to show up to vote if somebody makes an authentic, heartfelt appeal to you and encourages you to vote in-person or through something that’s very much like in-person. So it’s some gathering or some friend to friend communication as opposed to something impersonal, like you get a postcard.</p> 
          510 <p><strong>Steven Cherry </strong>So without looking into the details of the Biden campaign flip-flop on door-to-door canvassing, your hunch would be that they’re making the right move?</p> 
          511 <p><strong>Donald Green </strong>Yes, I think so. I mean, putting aside the other kinds of normative concerns about whether people are at risk if they get up and go out to canvass or they’re putting others at risk ... In terms of the raw politics of winning votes, it’s a good idea in part because in 2018, they were able to field an enormous army of very committed activists in many of the closely contested congressional elections and showed apparently very good, good results. And the tactic itself is so well tested that if they can do it with appropriate PPE and precautions, they could be quite effective.</p> 
          512 <p><strong>Steven Cherry </strong>In your research you found by contrast, door-hangers and yard signs—the way I spent that Saturday afternoon I described—have little or maybe even no utility.</p> 
          513 <p><strong>Donald Green </strong>Well, yard signs might have some utility to candidates, especially down-ballot candidates who are trying to increase their vote share. It doesn’t seem to have much of an effect on voter turnout. Maybe that’s because the election is already in full swing and everybody knows that there’s an election coming up—the yard sign isn’t going to convey any new information. But I do think the door hangers have some residual effect. They’re probably about as effective as a leaflet or a mailer, which is not very effective, but maybe a smidge better than zero.</p> 
          514 <p><strong>Steven Cherry </strong>You’re more positive on phone banks, albeit with some qualifiers.</p> 
          515 <p><strong>Donald Green </strong>Yes, I think that phone banking, especially authentic volunteer-staffed phone banking, can be rather effective. You know, I think that if you have an unhurried conversation with someone who is basically like-minded. They’re presumably targeted because they’re someone who shares more or less your political outlook and you bring them around to explain to them why it’s an important and historic election, giving them any guidance you can about when and how to vote. You can have an effect. It’s not an enormous effect. It’s something in the order of, say, three percentage points or about one additional vote for every 30 calls you complete. But it’s a substantial effect.</p> 
          516 <p>And if you are able to extract a commitment to vote from that person and you were to be so bold as to call them back on the day before the election to make sure that they’re making good on their pledge, then you can have an even bigger effect, in fact, a very large effect. So I do think it can be effective. I also think that perfunctory, hurried calls by telemarketing operations are rather ineffective for a number of reasons, but especially the lack of authenticity.</p> 
          517 <p><strong>Steven Cherry </strong>Let’s turn to social media, particularly Facebook. You described one rather pointless Facebook campaign that ended up costing $474 per vote. But your book also describes a very successful experiment in friend-to-friend communication on Facebook.</p> 
          518 <p><strong>Donald Green </strong>That’s right. We have a number of randomized trials suggesting that encouragements to vote via Facebook ads or other kinds of Facebook media that are mass-produced seem to be relatively limited in their effects. Perhaps the biggest, most intensive Facebook advertising campaign was its full-day banner ads that ran all day long—I think it was the 2010 election—and had precisely no effect, even though it was tested among 61 million people.</p> 
          519 <p>More effective on Facebook were ads that showed you whether your Facebook friends had claimed to vote. Now, that didn’t produce a huge harvest of votes, but it increased turnout by about a third of a percentage point. So better than nothing. The big effects you see on Facebook and elsewhere are where people are, in a personalized way, announcing the importance of the upcoming election and urging their Facebook friends—their own social networks—to vote.</p> 
          520 <p>And that seems to be rather effective and indeed is part of a larger literature that’s now coming to light, suggesting that even text messaging, though not a particularly personal form of communication, is quite effective when friends are texting other friends about the importance of registering and voting. Surprisingly effective, and that, I think, opens up the door to a wide array of different theories about what can be done to increase voter turnout. It seems as though friend-to-friend communication or neighbor-to-neighbor communication or communication among people who are coworkers or co-congregants ... that could be the key to raising turnout—not by not just one or two percentage points, but more like eight to 10.</p> 
          521 <p><strong>Steven Cherry </strong>On this continuum of personal versus impersonal, Facebook groups,—which are a new phenomenon—seem to lie somewhere in between. Some people are calling them “toxic echo chambers,” but they would seem to maybe be a godsend for political engagement.</p> 
          522 <p><strong>Donald Green </strong>I would think so, as long as the communication within the groups is authentic. If it’s if it’s automated, then probably not so much. But to the extent that the people in these groups have gotten to know each other or knew each other before they came into the group, then I think communication among them or between them could be quite compelling.</p> 
          523 <p><strong>Steven Cherry </strong>Yes. Although, of course, that person that you think you’re getting to know might be some employee in St. Petersburg, Russia, of the Internet Research Agency. Snapchat has been getting some attention these days in terms of political advertising. They’ve tried to be more transparent than Facebook, and they do some fact-checking on political advertising. Could it be a better platform for political ads or engagement?</p> 
          524 <p><strong>Donald Green </strong>I realize I just don’t know very much about the nuances of what they’re doing. I’m not sure that I have enough information to say.</p> 
          525 <p><strong>Steven Cherry </strong>Getting back to more analog activities, your book discusses events like rallies and processions, but I didn’t see anything about smaller coffee-klatch-style events where, say, you invite all your neighbors and friends to hear a local candidate speak. That would seem to combine the effectiveness of door-to-door canvassing with the Facebook friend-to-friend campaign. But maybe it’s hard to study experimentally.</p> 
          526 <p><strong>Donald Green </strong>That’s right. I would be very, very optimistic about the effects of those kinds of small gatherings. And it’s not that we are skeptical about their effects. It’s just, as you say, difficult to orchestrate a lot of experiments where people are basically opening their homes to friends. We need to talk to rope in more volunteers to bring in their friends experimentally.</p> 
          527 <p><strong>Steven Cherry </strong>The business model for some campaign professionals is to get paid relative to the amount of money that gets spent. Does that disincentivize the kind of person-to-person campaigning you generally favor?</p> 
          528 <p><strong>Donald Green </strong>Yes, I would say that one of the biggest limiting factors on person-to-person campaigning is that it’s very difficult for campaign consultants to make serious money off of it. And that goes double for the kind of serious money that is poured into campaigns in the final weeks. Huge amounts of money tend to be donated within the last three weeks of an election. And by that point, it’s very difficult to build the infrastructure necessary for large-scale canvassing or really any kind of retail-type politics. For that reason, the last-minute money tends to be dumped into digital ads and in television advertising—and in lots and lots of robocalls.</p> 
          529 <p><strong>Steven Cherry </strong>Don, as we record, this is less than a week after the first 2020 presidential debate and other events in the political news have maybe superseded the debate already. But I’m wondering if you have any thoughts about it in terms of getting out the vote. Many people, I have to say, myself included, found the debate disappointing. Do you think it’s possible for a debate to depress voter participation?</p> 
          530 <p><strong>Donald Green </strong>I think it’s possible. I think it’s rather unlikely to the extent that political science researchers have argued that negative campaigning depresses turnout, tends to depress turnout among independent voters, not so much among committed partisans who watched the debate and realize more than ever that their opponent is aligned with the forces of evil. For independent voters, they might say, “a plague on both your houses, I’m going to participate.” But I think that this particular election is one that is so intrinsically interesting that the usual way that independents feel about partisan competition probably doesn’t apply here.</p> 
          531 <p><strong>Steven Cherry </strong>On a lighter note, an upcoming podcast episode for me will be about video game culture. And it’ll be with a professor of communications who writes her own video games for her classes. Your hobby turns out to be designing board games. Are they oriented toward political science? Is there any overlap of these passions?</p> 
          532 <p><strong>Donald Green </strong>You know, it’s strange that they really don’t overlap at all. My interest in board games goes back to when I was a child. I’ve always been passionate about abstract board games like chess or go. And there was an accident that I started to design them myself. I did it actually when my fully-adult children were kids and we were playing with construction toys. And I began to see possibilities for games in those construction toys. And one thing led to another. And they were actually <a href="https://boardgamegeek.com/boardgame/450/octi">deployed to the world</a> and marketed. And now I think they’re kind of going the way of the dinosaur. But there’s still a few dinosaurs like me who enjoy playing on an actual physical board.</p> 
          533 <p><strong>Steven Cherry </strong>My girlfriend and I still play <a href="https://boardgamegeek.com/boardgame/917/rack-o">Rack-O</a>. So maybe this is not a completely lost cause.</p> 
          534 <p>Well Don, I think in the US, everyone’s thoughts will be far from the election until the counting stops. Opinions and loyalties differ. But the one thing I think we can all agree on is that participation is essential for the health of the body politic. On behalf of all voters, let me thank you for all that your book has done toward that end and for myself and my listeners, thank you for joining me today.</p> 
          535 <p><strong>Donald Green </strong>I very much appreciate it. Thanks.</p> 
          536 <p><strong>Steven Cherry </strong>We’ve been speaking with <a href="https://polisci.columbia.edu/content/donald-p-green">Donald Green</a>, a political scientist and co-author of <a href="https://www.brookings.edu/book/get-out-the-vote-2/"><em>Get Out the Vote</em></a><em>,</em> which takes a data-driven look at maximizing efforts to get out the vote.</p> 
          537 <p>This interview was recorded October 5th, 2020. Our thanks to Mike at Gotham Podcast Studio for audio engineering. Our <a href="https://www.youtube.com/watch?v=x6i8iQ1c0MM">music</a> is by <a href="https://freemusicarchive.org/music/Chad_Crouch">Chad Crouch</a>.</p> 
          538 <p>Radio Spectrum is brought to you by <em>IEEE Spectrum</em>, the member magazine of the Institute of Electrical and Electronic Engineers.</p> 
          539 <p>For Radio Spectrum, I’m <a href="mailto:metaphor@ieee.org">Steven Cherry</a>.</p> 
          540 <p><em>Note: Transcripts are created for the convenience of our readers and listeners. The authoritative record of IEEE Spectrum’s audio programming is the audio version.</em></p> 
          541 <p><em>We welcome your comments on Twitter (</em><a href="https://twitter.com/radiospectrum1">@RadioSpectrum1</a><em> and </em><a href="https://twitter.com/IEEESpectrum">@IEEESpectrum</a><em>) and </em><a href="https://www.facebook.com/IEEE.Spectrum">Facebook</a><em>.</em></p>]]></content:encoded>
          542       <dc:creator>Steven Cherry</dc:creator>
          543       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMTA3NQ.jpeg" />
          544       <media:content url="https://spectrum.ieee.org/image/MzcxMTA3NQ.jpeg" />
          545     </item>
          546     <item>
          547       <title>Programmable Filament Gives Even Simple 3D Printers Multi-Material Capabilities</title>
          548       <link>https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/programmable-filament-gives-even-simple-3d-printers-multimaterial-capabilities</link>
          549       <description>With a little bit of extra work but no additional hardware, your cheap 3D printer can make complex objects out of different kinds of filament</description>
          550       <category>consumer-electronics</category>
          551       <category>consumer-electronics/gadgets</category>
          552       <pubDate>Thu, 29 Oct 2020 20:03:00 GMT</pubDate>
          553       <guid>https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/programmable-filament-gives-even-simple-3d-printers-multimaterial-capabilities</guid>
          554       <content:encoded><![CDATA[<p>On the additive manufacturing spectrum, the majority of 3D printers are relatively simple, providing hobbyists with a way of conjuring arbitrary 3D objects out of long spools of polymer filament. If you want to make objects out of more than just that kind of filament, things start to get much more complicated, because you need a way of combining multiple different materials onto the print bed. There are a bunch of ways of doing this, but it’s not cheap, so most people without access to a corporate or research budget are stuck 3D printing with one kind of filament at a time.</p> 
          555 <p>At the <a href="https://uist.acm.org/uist2020/">ACM UIST Conference</a> last week, researchers presented <a href="http://www.jeeeunkim.com/papers/programmable-filament.pdf">a paper</a> that offers a way of giving even the simplest 3D printer the ability to print in as many materials as you need (or have the patience for) through a sort of printception—by first printing a filament out of different materials and then using that filament to print the multi-material object that you want.</p> 
          556 <!--nextpage--> 
          557 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/pNeKJnb--Dw?rel=0" width="620"></iframe></p> 
          558 <p>There are two steps to this process: filament creation&nbsp;and object printing. The actual object printing is the boring part—you just print your object like you normally would, except that when it’s done, there are different materials in all the right places, because those materials were programmed into the filament in advance, at exactly the locations and lengths they needed to be.</p> 
          559 <p>It might seem like using a multi-material filament to create a multi-material print doesn’t actually solve anything, since that multi-material filament has to be created by something that can print multiple materials, right? Right! And a simple 3D printer can totally do that, as long as you’re willing to change out filaments by hand. That’s really the trick here: you use a regular 3D printer to 3D print a complex multi-material filament that you then feed back into the printer to print your object.</p> 
          560 <p>It won’t surprise you that there’s a lot of computation involved in making this work, because the filament that you create (which the researchers describe as “programmable filament”) has to embody in its construction the location of every bit of material in the final object, and this can get very complicated, as it depends on both the geometry of the object and the path that the print head takes.</p> 
          561 <p>The filament for an object may involve many transitions from one material to another, even if the object itself is relatively simple. You can imagine printing a cup that’s half red and half blue, but because of the path of the print head, having the top be red and the bottom be blue involves a single transition in the filament, while having one side be red and the other side be blue could potentially involve two transitions with every single printed layer.</p> 
          562 <figure class="xlrg" role="img"> 
          563  <img alt="Illustration explaining the printing procedure of a programmable filament." src="/image/MzcxMTQyNg.jpeg"> 
          564  <figcaption class="hi-cap">
          565    Images:&nbsp;Programmable Filament Team 
          566  </figcaption> 
          567  <figcaption>
          568    Printing procedure of a filament: (a) Printing starts with one&nbsp;color, (b) it pauses upon completion of printing all segments, allowing the&nbsp;user to change the material. (c-d) The 3D printer prints the remaining&nbsp;segments avoiding collision with prior segments, (e) then prints stitches&nbsp;to join adjacent segments. 
          569  </figcaption> 
          570 </figure> 
          571 <p>The filament itself is printed in a spiral, one type of material at a time. Once every section of one material has been printed, you manually load in the next material, and the printer adds those bits onto the filament spiral in the right spots. You can repeat this process for as many materials as you need (the researchers have successfully printed with up to six), and once you’re done, the printer makes a final pass to stitch all the transition points between materials together. Then just lift the printed filament off of the print bed, reset the printer, feed in the filament you just created, and start the printer on creating your new object.</p> 
          572 <p>There is a bit of messiness that shows up in the print when one material transitions into another. If you’ve got different colored materials, they’ll blend a little bit, and for materials with different physical properties, well, who knows. You can either look at this as a bug, in which case you can modify the infill or wall density of the print so that transitions are less visible, or as a feature, in which case you can leverage this mixing to create deliberate color transitions or the mixing of structural properties by deliberately combining two or more materials into one piece of filament.&nbsp;</p> 
          573 <figure class="xlrg" role="img"> 
          574  <img alt="Example prints using the programmable filament system" src="/image/MzcxMTQyNw.jpeg"> 
          575  <figcaption class="hi-cap">
          576    Images:&nbsp;Programmable Filament Team 
          577  </figcaption> 
          578  <figcaption>
          579    Three different software designs allow users to choose various inputs and design parameters.&nbsp; 
          580  </figcaption> 
          581 </figure> 
          582 <p>The researchers note that their software pipeline and technique work with most (but not all) 3D printers, and that no matter what printer you’re using, careful calibration and monitoring is required. In future work, they’re hoping to optimize the process to make it as easy as possible to print custom filaments, and hopefully when they’ve got everything just right, they’ll make their software available for anyone who wants to make their simple 3D printer much, much more capable.</p>]]></content:encoded>
          583       <dc:creator>Evan Ackerman</dc:creator>
          584       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMTM5NQ.jpeg" />
          585       <media:content url="https://spectrum.ieee.org/image/MzcxMTM5NQ.jpeg" />
          586     </item>
          587     <item>
          588       <title>Squabbling Over the Waters of the River Nile</title>
          589       <link>https://spectrum.ieee.org/green-tech/conservation/squabbling-over-the-waters-of-the-river-nile</link>
          590       <description>Ethiopia has dammed a tributary, crimping the flow into Egypt, which for the first time in millennia is running out of water</description>
          591       <category>green-tech</category>
          592       <category>green-tech/conservation</category>
          593       <pubDate>Thu, 29 Oct 2020 19:00:00 GMT</pubDate>
          594       <guid>https://spectrum.ieee.org/green-tech/conservation/squabbling-over-the-waters-of-the-river-nile</guid>
          595       <content:encoded><![CDATA[<figure class="xlrg" role="img"> 
          596  <img alt="Construction machinery stands in the center of the dam wall at the site of the under-construction Grand Ethiopian Renaissance Dam in the Benishangul-Gumuz Region of Ethiopia." src="/image/MzcwNzk3Mg.jpeg"> 
          597  <div class="ai"> 
          598   <figcaption class="hi-cap">
          599     Photo: Zacharias Abubeker/Bloomberg/Getty Images 
          600   </figcaption> 
          601   <figcaption>
          602     Construction machinery stands in the center of the dam wall at the site of the Grand Ethiopian Renaissance Dam, under construction in the Benishangul-Gumuz Region of Ethiopia, on Tuesday, May 21, 2019. 
          603   </figcaption> 
          604  </div> 
          605 </figure> 
          606 <p><strong><span>The completion of the</span> </strong>Grand Ethiopian Renaissance Dam on the Blue Nile has made Egypt fear for its very existence. To understand why requires the appreciation of basic water-flow and water-use numbers in the region.</p> 
          607 <p>The Blue Nile flows from Ethiopia’s Lake Tana, carrying <a href="http://www.fao.org/3/an530e/an530e.pdf">48.3 cubic</a> kilometers of water a year. In Khartoum, Sudan, it merges with the White Nile, which adds 26 km<sup>3</sup>/year. The Atbara adds 11.1 km<sup>3</sup>. These rivers, coming out of Ethiopia, together provide about <a href="http://www.fao.org/3/an530e/an530e.pdf">70 percent</a> of the Nile’s flow into Egypt.</p> 
          608 <p>The <a href="http://gis.nacse.org/tfdd/tfdddocs/92ENG.pdf">Anglo-Egyptian treaty of 1929</a> secured for Egypt the rights to 48 km<sup>3</sup> of water; the <a href="http://ewp.cedare.org/wp-content/uploads/2018/09/Agreement-with-Annexes-between-the-United-Arab-Republic-and-the-Republic-of-Sudan-for-the-full-utilization-of-the-Nile-waters.-Signed-at-Cairo-on-8-November-1959.pdf">1959 treaty update</a> raised the amount to 55.5 km<sup>3</sup>, with Sudan getting 18.5 km<sup>3</sup>. After accounting for the intervening water losses in the annually flooded Sudd swamps of South Sudan, this allocation left all the other states along the Nile tributaries with no claims to the water at all.</p> 
          609 <p>Egypt still upholds this allocation, but in 2009 Ethiopia began a de facto dismantling of the arrangement with the completion of a <a href="https://www.stantec.com/en/projects/united-states-projects/t/tekeze-hydropower">dam on the Tekezé River</a>, a tributary to the Atbara. At 188 meters high, it is the tallest African arch dam (shaped to resist water pressure), although it has an installed hydropower capacity of just 300 megawatts and a relatively small reservoir, holding 9 km<sup>3</sup>. The next Ethiopian action, the <a href="https://en.wikipedia.org/wiki/Beles_Hydroelectric_Power_Plant">Tana Beles hydro project</a> (460 MW), began to generate electricity in 2010 and has no storage. Instead, it gets its water straight from Lake Tana and discharges it into the Beles River, a tributary of the Blue Nile. By themselves, these two projects would cause little worry to Egypt, were its dependence on the Nile’s water not becoming precarious.</p> 
          610 <aside class="inlay xlrg"> 
          611  <h3 class="sb-hed"><span style="display:inline !important">How Very Diverting</span></h3> 
          612  <p>Egypt drinks from the River Nile, and the country stands to lose should anything cut the flow from its tributaries, which originate deep within Africa. Just such a threat now looms following the inauguration of the Grand Ethiopian Renaissance Dam.</p> 
          613  <figure class="xlrg" role="img"> 
          614   <a class="zoom" href="/image/MzcwODA3Nw.jpeg" rel="lightbox"><img alt="Illustration of the Grand Ethiopian Renaissance Dam." src="/image/MzcwODA3Nw.jpeg"><span class="magnifier">&nbsp;</span></a> 
          615   <div class="ai"> 
          616    <figcaption class="hi-cap">
          617      Illustration: Francesco Muzi/StoryTK 
          618    </figcaption> 
          619   </div> 
          620  </figure> 
          621 </aside> 
          622 <p>In 1959 Egypt’s population was about 26 million, by 2020 it had nearly quadrupled to just over 100 million, and it is now increasing by a little under 2 million a year. This growth has reduced the country’s per capita annual supply of fresh water to only 550 cubic meters, less than half the U.S. rate. Should the population reach its projected size of 160 million in 2050, this rate might fall below 400 cubic meters.</p> 
          623 <p>The challenge is greatly increased by the new Renaissance dam on the Blue Nile, near Ethiopia’s border with Sudan. The dam, completed in June 2020, has an installed hydropower capacity of 6.45 gigawatts and a reservoir designed to hold 74 km<sup>3</sup>. The rainy season of 2020 has already put in 5 km<sup>3</sup> of water.</p> 
          624 <p>Filling the rest of the reservoir in five years would cut the annual flow out of Ethiopia by 30 percent and thus the flow into Egypt by just over 20 percent (that is, 30 percent of 70 percent). This would deprive Egypt of one-fifth of its water, and even after the reservoir has been filled, retention of flows during dry years would continue to limit the downstream supply.</p> 
          625 <p>What Egypt sees as a <a href="https://www.bbc.com/news/world-africa-53573154">mortal challenge</a>, Ethiopia considers to be its inalienable right: That country numbers 115 million people, growing by 2.6 million a year, and it has a per capita gross domestic product less than 20 percent of the Egyptian average. Should Ethiopia forever remain hopelessly impoverished to support a better-off country?</p> 
          626 <p>Partial solutions are possible, but none is easy or easily affordable. Egypt’s own Aswan High Dam (2.1 GW), completed in 1970, impounds 132 km<sup>3</sup> but, situated in one of the world’s hottest regions, it loses annually up to 15 km<sup>3</sup> to evaporation. Storing this water in a less extreme environment (the best location would be in South Sudan) would reduce the loss but deprive Egypt of 2.1 GW of installed capacity and of water control for its deltaic irrigation. Channeling the White Nile through South Sudan, around the Sudd swamps, would cut the region’s huge evaporation losses, but ever since gaining its independence in 2011, that nation has experienced endless civil war, tribal fighting, and chronic political instability.</p>]]></content:encoded>
          627       <dc:creator>Vaclav Smil</dc:creator>
          628       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNzk0Mg.jpeg" />
          629       <media:content url="https://spectrum.ieee.org/image/MzcwNzk0Mg.jpeg" />
          630     </item>
          631     <item>
          632       <title>The 11 Greatest Vacuum Tubes You’ve Never Heard Of</title>
          633       <link>https://spectrum.ieee.org/tech-history/space-age/the-11-greatest-vacuum-tubes-youve-never-heard-of</link>
          634       <description>These vacuum devices stood guard during the Cold War, advanced particle physics, treated cancer patients, and made the Beatles sound good</description>
          635       <category>tech-history</category>
          636       <category>tech-history/space-age</category>
          637       <pubDate>Thu, 29 Oct 2020 15:00:00 GMT</pubDate>
          638       <guid>https://spectrum.ieee.org/tech-history/space-age/the-11-greatest-vacuum-tubes-youve-never-heard-of</guid>
          639       <content:encoded><![CDATA[<style type="text/css">hr {
          640     box-sizing: content-box;
          641     height: 0;
          642     overflow: visible;
          643     margin-top: 1rem;
          644     margin-bottom: 1rem;
          645     border: 0;
          646     border-top: 5px solid rgba(0,0,0,.3);
          647 }
          648 </style> 
          649 <figure class="xlrg" role="img"> 
          650  <img alt="From left: Thales multi-beam klystron, Ubitron, Coaxitron" src="/image/MzcxMDc3MA.jpeg"> 
          651  <figcaption class="hi-cap">
          652    From left, Thales; Robert Phillips; RCA 
          653  </figcaption> 
          654 </figure> 
          655 <p><strong>In an age propped up by</strong> quintillions of solid-state devices, should you even care about vacuum tubes? You definitely should! For richness, drama, and sheer brilliance, few technological timelines can match the 116-year (and counting) history of the vacuum tube. To prove it, I’ve assembled a list of vacuum devices that over the past 60 or 70 years inarguably changed the world.</p> 
          656 <p></p> 
          657 <p></p> 
          658 <p>And just for good measure, you’ll also find here a few tubes that are too unique, cool, or weird to languish in obscurity.</p> 
          659 <p></p> 
          660 <p>Of course, anytime anyone offers up a list of <em>anything</em>—the&nbsp;comfiest trail-running shoes, the most authentic Italian restaurants in Cleveland, movies that are better than the book they’re based on—someone else is bound to weigh in and either object or amplify. So, to state the obvious: This is <em>my</em> list of vacuum tubes. But I’d love to read yours. Feel free to add it in the comments section at the end of this article.</p> 
          661 <p></p> 
          662 <p>My list isn’t meant to be comprehensive. Here you’ll find no gas-filled glassware like <a href="/tech-history/dawn-of-electronics/the-nixie-tube-story-the-neon-display-tech-that-engineers-cant-quit">Nixie tubes</a> or thyratrons, no “uber&nbsp;high” pulsed-power microwave devices, no cathode-ray display tubes. I intentionally left out well-known tubes, such as satellite traveling-wave tubes and microwave-oven magnetrons. And I’ve pretty much stuck with <a href="http://nvlpubs.nist.gov/nistpubs/ScientificPapers/nbsscientificpaper449vol18p335_A2b.pdf">radio-frequency tubes</a>, so I’m ignoring the vast panoply of audio-frequency tubes—with one notable exception.</p> 
          663 <p></p> 
          664 <p>But even within the parameters I’ve chosen, there are so many amazing devices that it was rather hard to pick just eleven of them. So here’s my take, in no particular order, on some tubes that made a difference.</p> 
          665 <p></p> 
          666 <hr> 
          667 <h2><strong>Medical Magnetron</strong></h2> 
          668 <p></p> 
          669 <div> 
          670  <figure class="rt med" role="img"> 
          671   <img alt="Photo of Medical Mangnetron" src="/image/MzcwOTcyMA.jpeg"> 
          672   <div class="ai"> 
          673    <figcaption class="hi-cap">
          674      Photo: Teledyne ev 
          675    </figcaption> 
          676   </div> 
          677  </figure> 
          678  <p><strong>When it comes</strong> to efficiently generating coherent radio-frequency power in a compact package, you can’t beat the magnetron.</p> 
          679  <p>The magnetron first rose to glory in World War II, <a href="/tech-history/dawn-of-electronics/from-world-war-ii-radar-to-microwave-popcorn-the-cavity-magnetron-was-there">to power British radar</a>. While the magnetron’s use in radar began to wane in the 1970s, the tube found new life in industrial, scientific, and medical applications, which continues today.</p> 
          680  <p>It is for this last use that the medical magnetron shines. In a linear accelerator, it creates a high-energy electron beam. When electrons in the beam are deflected by the nuclei in a target—consisting of a material having a high atomic number, such as tungsten—copious X-rays are produced, which can then be directed to kill cancer cells in tumors. The first clinical accelerator for radiotherapy was installed at London’s Hammersmith Hospital in 1952. A 2-megawatt magnetron powered the 3-meter-long accelerator.</p> 
          681  <p>High-power magnetrons continue to be developed to meet the demands of radiation oncology. The medical magnetron shown here, manufactured by <a href="https://www.teledyne-e2v.com/">e2v</a> Technologies (now Teledyne e2v), generates a peak power of 2.6 MW, with an average power of 3 kilowatts and an efficiency of more than 50 percent. Just 37 centimeters long and weighing about 8 kilograms, it’s small and light enough to fit the rotating arm of a radiotherapy machine.</p> 
          682 </div> 
          683 <hr> 
          684 <div> 
          685  <h2><strong>Gyrotron</strong></h2> 
          686 </div> 
          687 <div> 
          688  <figure class="xlrg" role="img"> 
          689   <img alt="Photo of Gyrotron" src="/image/MzcwOTcyMw.jpeg"> 
          690   <div class="ai"> 
          691    <figcaption class="hi-cap">
          692      Photo: Nuclear Fusion/IAEA 
          693    </figcaption> 
          694   </div> 
          695  </figure> 
          696  <p><strong>Conceived in the 1960s </strong>in the Soviet Union, the gyrotron is a high-power vacuum device used primarily for heating plasmas in nuclear-fusion experiments, such as <a href="https://www.iter.org/">ITER</a>, now under construction in southern France. These experimental reactors can require temperatures of up to 150 million °C.</p> 
          697  <p>So how does a megawatt-class gyrotron work? The name provides a clue: It uses beams of energetic electrons rotating or gyrating in a strong magnetic field inside a cavity. (We tube folks love our <em>-trons</em> and <em>-trodes.</em>) The interaction between the gyrating electrons and the cavity’s electromagnetic field generates high-frequency radio waves, which are directed into the plasma. The high-frequency waves accelerate the electrons within the plasma, heating the plasma in the process.</p> 
          698  <p>A tube that produces 1 MW of average power is not going to be small. Fusion gyrotrons typically stand around 2 to 2.5 meters tall and weigh around a metric ton, including a 6- or 7-tesla superconducting magnet.</p> 
          699  <p>In addition to heating fusion plasmas, gyrotrons are used in material processing and nuclear magnetic resonance spectroscopy. They have also been explored for nonlethal crowd control, in the U.S. military’s <a href="https://jnlwp.defense.gov/About/Frequently-Asked-Questions/Active-Denial-System-FAQs/">Active Denial System</a>. This system projects a relatively wide millimeter-wave beam, perhaps a meter and a half in diameter. The&nbsp;beam is designed to heat the surface of a person’s skin, creating a burning sensation but without penetrating into or damaging the tissue below.</p> 
          700  <hr> 
          701 </div> 
          702 <div> 
          703  <h2><strong>Mini Traveling-Wave Tube</strong></h2> 
          704  <figure class="xlrg" role="img"> 
          705   <img alt="Photo of Mini Traveling-Wave Tube" src="/image/MzcwOTc1Mw.jpeg"> 
          706   <div class="ai"> 
          707    <figcaption class="hi-cap">
          708      Photo: L3Harris Electron Devices 
          709    </figcaption> 
          710   </div> 
          711  </figure> 
          712  <p><strong>As its name suggests,</strong> a traveling-wave tube (TWT) amplifies signals through the interaction between an electric field of a&nbsp;traveling, or propagating, electromagnetic wave in a circuit and a streaming electron beam. [For a more detailed description of how a TWT works, see “<a href="/semiconductors/devices/the-quest-for-the-ultimate-vacuum-tube">The Quest for the Ultimate Vacuum Tube</a>,” <em>IEEE Spectrum</em>, December 2015.]</p> 
          713  <p>Most TWTs of the 20th century were designed for extremely high power gain, with amplification ratios of 100,000 or more. But you don’t always need that much gain. Enter the mini TWT, shown here in an example from <a href="https://www2.l3t.com/edd/">L3Harris Electron Devices</a>. With a gain of around 1,000 (or 30 decibels), a mini TWT is meant for applications where you need output power in the 40- to 200-watt range, and where small size and lower voltage are desirable. A 40-W mini TWT operating at 14&nbsp;gigahertz, for example, fits in the palm of your hand and weighs less than half a kilogram.</p> 
          714  <p>As it turns out, military services have a great need for mini TWTs. Soon after their introduction in the 1980s, mini TWTs were adopted in electronic warfare systems on planes and ships for protection against radar-guided missiles. In the early 1990s, device designers began integrating mini TWTs with a compact high-voltage power supply to energize the device and a solid-state amplifier to drive it. The combination created what is known as a <a href="https://ieeexplore.ieee.org/document/757252">microwave power module, or MPM</a>. Due to their small size, low weight, and high efficiency, MPM amplifiers found immediate use in radar and communications transmitters aboard military drones, such as the Predator and Global Hawk, as well as in electronic countermeasures.</p> 
          715  <hr> 
          716 </div> 
          717 <div> 
          718  <h2><strong>Accelerator Klystron</strong></h2> 
          719 </div> 
          720 <div> 
          721  <figure class="xlrg" role="img"> 
          722   <img alt="Photo of Accelerator Klystron" src="/image/MzcwOTc2NA.jpeg"> 
          723   <div class="ai"> 
          724    <figcaption class="hi-cap">
          725      Photo: Archives and History Office/SLAC National Accelerator Laboratory 
          726    </figcaption> 
          727   </div> 
          728  </figure> 
          729  <p><strong>The klystron helped</strong> usher in the era of big science in high-energy physics. Klystrons convert the kinetic energy of an electron beam into radio-frequency energy. The device has much greater output power than does a traveling-wave tube or a magnetron. The brothers Russell and Sigurd Varian <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;arnumber=6368703">invented the klystron</a> in the 1930s and, with others, founded Varian Associates to market it. These days, Varian’s tube business lives on at <a href="https://www.cpii.com/">Communications and Power Industries</a>.</p> 
          730  <p>Inside a klystron, electrons emitted by a cathode accelerate toward an anode to form an electron beam. A magnetic field keeps the beam from expanding as it travels through an aperture in the anode to a beam collector. In between the anode and collector are hollow structures called cavity resonators. A high-frequency signal is applied to the resonator nearest the cathode, setting up an electromagnetic field inside the cavity. That field modulates the electron beam as it passes through the resonator, causing the speed of the electrons to vary and the electrons to bunch as they move toward the other cavity resonators downstream. Most of the electrons decelerate as they traverse the final resonator, which oscillates at high power. The result is an output signal that is much greater than the input signal.</p> 
          731  <p>In the 1960s, engineers developed a klystron to serve as the RF source for a new 3.2-kilometer <a href="https://www6.slac.stanford.edu/about/slac-history">linear particle accelerator</a> being built at Stanford University. Operating at 2.856 gigahertz and using a 250-kilovolt electron beam, the SLAC klystron produced a peak power of 24 MW. More than 240 of them were needed to attain particle energies of up to 50 billion electron volts.</p> 
          732  <p>The <a href="https://www-group.slac.stanford.edu/kly/default2.htm">SLAC klystrons</a> paved the way for the widespread use of vacuum tubes as RF sources for advanced particle physics and X-ray light-source facilities. A 65-MW version of the SLAC klystron is still in production. Klystrons are also used for cargo screening, food sterilization, and radiation oncology.</p> 
          733  <hr> 
          734 </div> 
          735 <div> 
          736  <h2><strong>Ring-Bar Traveling-Wave Tube</strong></h2> 
          737  <figure class="rt med" role="img"> 
          738   <img alt="Photo of Ring-Bar Traveling-Wave Tube" src="/image/MzcxMDQ3NQ.jpeg"> 
          739   <div class="ai"> 
          740    <figcaption class="hi-cap">
          741      Photo: L3Harris Electron Devices 
          742    </figcaption> 
          743   </div> 
          744  </figure> 
          745  <p><strong>One Cold War tube</strong> that is still going strong is the huge ring-bar traveling-wave tube. This high-power tube stands over 3 meters from cathode to collector, making it the world’s largest TWT. There are 128 ring-bar TWTs providing the radio-frequency oomph for an exceedingly powerful phased-array radar at the Cavalier Air Force Station in North Dakota. Called the <a href="https://www.afspc.af.mil/About-Us/Fact-Sheets/Display/Article/1126406/perimeter-acquisition-radar-attack-characterization-system/">Perimeter Acquisition Radar Attack Characterization System (PARCS)</a>, this 440-megahertz radar looks for ballistic missiles launched toward North America. It also monitors space launches and orbiting objects as part of the <a href="https://en.wikipedia.org/wiki/United_States_Space_Surveillance_Network">Space Surveillance Network</a>. Built by GE in 1972, PARCS tracks more than half of all Earth-orbiting objects, and it’s said to be able to identify a basketball-size object at a range of 2,000 miles (3,218 km).</p> 
          746  <p>An even higher-frequency version of the ring-bar tube is used in a phased-array radar on remote Shemya Island, about 1,900 km off the coast of Alaska. Known as <a href="https://www.afspc.af.mil/About-Us/Fact-Sheets/Display/Article/1126403/cobra-dane-radar/">Cobra Dane</a>, the radar monitors non-U.S. ballistic missile launches. It also collects surveillance data on space launches and satellites in low Earth orbit.</p> 
          747  <p>The circuit used in this behemoth is known as a ring bar, which consists of circular rings connected by alternating strips, or bars, repeated along its length. This setup provides a higher field intensity across the tube’s electron beam than does a garden-variety TWT, in which the radio-frequency waves propagate along a helix-shaped wire. The ring-bar tube’s higher field intensity results in higher power gain and good efficiency. The tube shown here was developed by Raytheon in the early 1970s; it is now manufactured by L3Harris Electron Devices.</p> 
          748  <hr> 
          749 </div> 
          750 <div> 
          751  <h2><strong>Ubitron</strong></h2> 
          752  <figure class="lt med" role="img"> 
          753   <img alt="Photo of a man and the Ubitron" src="/image/MzcwOTY5Ng.jpeg"> 
          754   <div class="ai"> 
          755    <figcaption class="hi-cap">
          756      Photo: Robert Phillips 
          757    </figcaption> 
          758   </div> 
          759  </figure> 
          760  <p><strong>Fifteen years before</strong> the term “free-electron laser” was coined, there was a vacuum tube that worked on the same basic principle—<a href="https://ieeexplore.ieee.org/document/1472793">the ubitron</a>, which sort of stands for “undulating beam interaction.”</p> 
          761  <p>The 1957 invention of the ubitron came about by accident. Robert&nbsp;Phillips, an engineer at the General Electric Microwave Lab in&nbsp;Palo Alto, Calif., was trying to explain why one of the lab’s traveling-wave tubes oscillated and another didn’t. Comparing the two tubes, he noticed variations in their magnetic focusing, which caused the beam in one tube to wiggle. He figured that this undulation could result in a periodic interaction with an electromagnetic wave in a waveguide. That, in turn, could be useful for creating exceedingly high levels of peak radio-frequency power. Thus, the ubitron was born.</p> 
          762  <p>From 1957 to 1964, Phillips and colleagues built and tested a variety of ubitrons. The 1963 photo shown here is of GE colleague Charles Enderby holding a ubitron without its wiggler magnet. Operating at 70,000 volts, this tube produced a peak power of 150 kW at 54 GHz, a record power level that stood for well over a decade. But the U.S. Army, which funded the ubitron work, halted R&amp;D in 1964 because there were no antennas or waveguides that could handle power levels that high.</p> 
          763  <p>Today’s free-electron lasers employ the same basic principle as the ubitron. In fact, in recognition of his pioneering work on the ubitron, Phillips received the Free-Electron Laser Prize in 1992. The&nbsp;FELs now installed in the large light and X-ray sources at particle accelerators produce powerful electromagnetic radiation, which is used to explore the dynamics of chemical bonds, to understand photosynthesis, to analyze how drugs bind with targets, and even to create warm, dense matter to study how gas planets form.</p> 
          764  <hr> 
          765 </div> 
          766 <div> 
          767  <h2><strong>Carcinotron</strong></h2> 
          768  <figure class="rt med-lrg" role="img"> 
          769   <img alt="Photo of the Carcinotron" src="/image/MzcwOTY5NA.jpeg"> 
          770   <div class="ai"> 
          771    <figcaption class="hi-cap">
          772      Photo: CSF 
          773    </figcaption> 
          774   </div> 
          775  </figure> 
          776  <p><strong>The French tube</strong> called the carcinotron is another fascinating example born of the Cold War. Related to the magnetron, it was conceived by Bernard Epsztein in 1951 at Compagnie Générale de Télégraphie Sans Fil (CSF, now part of Thales).</p> 
          777  <p>Like the ubitron, the carcinotron grew out of an attempt to resolve an oscillation problem on a conventional tube. In this case, the source of the oscillation was traced to a radio-frequency circuit’s power flowing backward, in the opposite direction of the tube’s electron beam. Epsztein discovered that the oscillation frequency could be varied with voltage, which led to a patent for a voltage-tunable <a href="https://patentimages.storage.googleapis.com/18/94/64/e00b08eda2b809/US2919375.pdf">“backward wave” tube</a>.</p> 
          778  <p>For about 20 years, electronic jammers in the United States and Europe employed carcinotrons as their source of RF power. The tube shown here was one of the first manufactured by CSF in 1952. It delivered 200 W of RF power in the S band, which extends from 2 to 4 GHz.</p> 
          779  <p>Considering the level of power they can handle, carcinotrons are fairly compact. Including its permanent focusing magnet, a 500-W model weighs just 8 kg and measures 24 by 17 by 15 cm, a shade smaller than a shoebox.</p> 
          780  <p>And the strange name? <a href="https://vacuumelectronics.org/eds_members/philippe_thouvenin.html">Philippe Thouvenin</a>, a vacuum electronics scientist at Thales Electron Devices, told me that it comes from a Greek word, <em>karkunos</em>, which means crayfish. And crayfish, of course, swim backwards.</p> 
          781 </div> 
          782 <hr> 
          783 <h2><strong><span>Dual-Mode Traveling-Wave Tube </span></strong></h2> 
          784 <figure class="xlrg" role="img"> 
          785  <img alt="Photo of Dual-Mode Traveling-Wave Tube" src="/image/MzcwOTY4NA.jpeg"> 
          786  <div class="ai"> 
          787   <figcaption class="hi-cap">
          788     Photo: Northrop Grumman 
          789   </figcaption> 
          790  </div> 
          791 </figure> 
          792 <p><span><strong>The dual-mode TWT </strong>was an oddball microwave tube developed in the United States in the 1970s and ’80s for electronic countermeasures against radar. Capable of both low-power continuous-wave and high-power pulsed operation, this tube followed the old adage that two is better than one: It had two beams, two circuits, two electron guns, two focusing magnets, and two collectors, all enclosed in a single vacuum envelope. </span></p> 
          793 <p><span>The tube’s main selling point was that it broadened the uses of a given application—a countermeasure system, for example, could operate in both continuous-wave and pulsed-power modes but with a single transmitter and a simple antenna feed. A control grid in the electron gun in the shorter, pulsed-power section could quickly switch the tube from pulsed to continuous wave, or vice versa. Talk about packing a lot of capability into a small package. Of course, if the vacuum leaked, you’d lose both tube functions. </span></p> 
          794 <p><span>The tube shown here was developed by Raytheon’s Power Tube Division<span class="MsoHyperlink">, which was acquired by Litton Electron Devices in 1993. Raytheon/</span>Litton as well as Northrop Grumman manufactured the dual-mode TWT, but it was notoriously hard to produce in volume and was discontinued in the early 2000s. </span></p> 
          795 <hr> 
          796 <h2><strong><span>Multi-Beam Klystron </span></strong></h2> 
          797 <figure class="rt med" role="img"> 
          798  <img alt="Photo of Multi-Beam Klystron " src="/image/MzcwOTY1NA.jpeg"> 
          799  <div class="ai"> 
          800   <figcaption class="hi-cap">
          801     Photo: Thales 
          802   </figcaption> 
          803  </div> 
          804 </figure> 
          805 <p><span><strong>Power, as many</strong> of us learned as youngsters, equals voltage times current. To get more power out of a vacuum tube, you can increase the voltage of the tube’s electron beam, but that calls for a bigger tube and a more complex power supply. Or you can raise the beam’s current, but that can be problematic too. For that, you need to ensure the device can support the higher current and that the required magnetic field can transport the electron beam safely through the tube’s circuit—that is, the part of the tube that interacts with the electron beam. </span></p> 
          806 <p><span>Adding to the challenge, a tube’s efficiency generally falls as the beam’s current rises because the bunching of the electrons required for power conversion suffers.</span></p> 
          807 <p><span>All these caveats apply if you’re talking about a conventional vacuum tube with a single electron beam and a single circuit. But what if you employ multiple beams, originating from multiple cathodes and traveling through a common circuit? Even if the individual beam currents are moderate, the total current will be high, while the device’s overall efficiency is unaffected.</span></p> 
          808 <p><span>Such a multiple-beam device was studied in the 1960s in the United States, the Soviet Union, and elsewhere. The U.S. work petered out, but activity in the USSR continued, leading to the successful deployment of the multi-beam klystron, or MBK. The Soviets fielded many of these tubes for radar and other uses. </span></p> 
          809 <p><span>A modern example of an MBK is shown above, produced in 2011 by the French firm Thomson Tubes Electroniques (now part of </span><a href="https://www.thalesgroup.com/en"><span>Thales</span></a><span>). This MBK was developed for the </span><a href="https://www.desy.de/index_eng.html"><span>German Electron Synchrotron facility (DESY)</span></a><span>. A later version is used at the </span><a href="https://www.xfel.eu/"><span>European X-Ray Free Electron Laser facility</span></a><span>. The tube has seven beams providing a total current of 137 amperes, with a peak power of 10 MW and average power of 150 kW; its efficiency is greater than 63 percent. By contrast, a single-beam klystron developed by Thomson provides 5 MW peak and 100 kW average power, with an efficiency of 40 percent. So, in terms of its amplification capability, one MBK is equivalent to two conventional klystrons.</span></p> 
          810 <hr> 
          811 <div> 
          812  <h2><strong>Coaxitron</strong></h2> 
          813 </div> 
          814 <div> 
          815  <figure class="lt med" role="img"> 
          816   <img alt="Photo of a Coaxitron" src="/image/MzcwOTY4Ng.jpeg"> 
          817   <div class="ai"> 
          818    <figcaption class="hi-cap">
          819      Photo: RCA 
          820    </figcaption> 
          821   </div> 
          822  </figure> 
          823  <p><strong>All the tubes </strong>I’ve described so far are what specialists call beam-wave devices (or stream-wave in the case of the magnetron). But before those devices came along, tubes had grids, which are transparent screenlike metal electrodes inserted between the tube’s cathode and anode to control or modulate the flow of electrons. Depending on how many grids the tube has, it is called a diode (no grids), a triode (one grid), a tetrode (two grids), and so on. Low-power tubes were referred to as “receiving tubes,” because they were typically used in radio receivers, or as switches. (Here I should note that what I’ve been referring to as a “tube” is known to the British as a “valve.”)</p> 
          824  <p>There were, of course, <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;arnumber=1450965">higher-power grid tubes</a>. Transmitting tubes were used in—you guessed it—radio transmitters. Later on, high-power grid tubes found their way into a wide array of interesting industrial, scientific, and military applications.</p> 
          825  <p>Triodes and higher-order grid tubes all included a cathode, a current-control grid, and an anode or collector (or plate). Most of these tubes were cylindrical, with a central cathode, usually a filament, surrounded by electrodes.</p> 
          826  <p>The coaxitron, developed by RCA beginning in the 1960s, is a unique permutation of the cylindrical design. The electrons flow radially from the cylindrical coaxial cathode to the anode. But rather than having a single electron emitter, the coaxitron’s cathode is segmented along its circumference, with numerous heated filaments serving as the electron source. Each filament forms its own little beamlet of electrons. Because the beamlet flows radially to the anode, no magnetic field (or magnet) is required to confine the electrons. The coaxitron is thus very compact, considering its remarkable power level of around a megawatt.</p> 
          827  <p>A 1-MW, 425-MHz coaxitron weighed 130 pounds (59 kg) and stood 24 inches (61 cm) tall. While the gain was modest (10 to 15 dB), it was still a tour de force as a compact ultrahigh-frequency power booster. RCA envisioned the coaxitron as a source for driving RF accelerators, but it ultimately found a home in high-power UHF radar. Although coaxitrons were recently overtaken by solid-state devices, some are still in service in legacy radar systems.</p> 
          828  <hr> 
          829  <div> 
          830   <h2><strong>Telefunken Audio Tube</strong></h2> 
          831  </div> 
          832  <div> 
          833   <figure class="xlrg" role="img"> 
          834    <img alt="Photo of Telefunken Audio Tube" src="/image/MzcwOTY1Mw.jpeg"> 
          835    <div class="ai"> 
          836     <figcaption class="hi-cap">
          837       Photo: Thump/Soundgas 
          838     </figcaption> 
          839    </div> 
          840   </figure> 
          841   <p><strong>An important conventional</strong> tube with grids resides at the opposite end of the power/frequency spectrum from megawatt beasts like the klystron and the gyrotron. Revered by audio engineers and recording artists, the Telefunken VF14M was employed as an amplifier in the legendary <a href="https://vintageking.com/neumann-u47-u48-microphone">Neumann U47 and U48 microphones</a> favored by Frank Sinatra and by the Beatles’ producer Sir George Martin. Fun fact: There’s a Neumann U47 microphone on display <a href="https://abbeyroadinstitute.co.uk/blog/microphone-cabinet-neumann-u47/">at the Abbey Road Studio</a> in London. The “M” in the VF14M tube designation indicates it’s suitable for microphone use and was only awarded to tubes that passed screening at Neumann.</p> 
          842   <p>The VF14 is a pentode, meaning it has five electrodes, including three grids. When used in a microphone, however, it operates as a triode, with two of its grids strapped together and connected to the anode. This was done to exploit the supposedly superior sonic qualities of a triode. The VF14’s heater circuit, which warms the cathode so that it emits electrons, runs at 55 V. That voltage was chosen so that two tubes could be wired in series across a 110-V main to reduce power-supply costs, which was important in postwar Germany.</p> 
          843   <p>Nowadays, you can buy a solid-state replacement for the VF14M that even simulates the tube’s 55-V heater circuit. But can it replicate that warm, lovely tube sound? On that one, audio snobs will never agree.</p> 
          844   <p><em>This article appears in the November 2020 print issue as “The 9 Greatest Vacuum Tubes You’ve Never Heard Of.”</em></p> 
          845   <aside class="inlay xlrg"> 
          846    <h3 class="sb-hed"><span style="display:inline !important">Life of a Tube Guy</span></h3> 
          847    <figure class="rt med" role="img"> 
          848     <img alt="Photo of Carter M. Armstrong" src="/image/MzcwOTMwMA.jpeg"> 
          849     <div class="ai"> 
          850      <figcaption class="hi-cap">
          851        Photo: Michael Martin 
          852      </figcaption> 
          853     </div> 
          854    </figure> 
          855    <p>“If you’d told me I’d spend my career working on vacuum tubes, I’d have said, ‘No way. That’s crazy!’ ”</p> 
          856    <p>So says Carter M. Armstrong, who has in fact spent the last 40-some years working on vacuum devices. It started in graduate school, when his Ph.D. advisor at the <a href="https://www.umd.edu/">University of Maryland</a> turned him on to electron beams. And it continued through stints at <a href="https://www.ncsu.edu/">North Carolina State University</a>, the <a href="https://www.nrl.navy.mil/">Naval Research Laboratory,</a> <a href="https://www.northropgrumman.com/">Northrop Grumman</a>, Litton, and most recently <a href="https://www.l3harris.com/">L3Harris</a>, in Torrance, Calif., where he is director of advanced development for the company’s Electron Devices division.</p> 
          857    <p>Throughout, Armstrong says, the work has been intellectually stimulating and emotionally rewarding. “It’s good to work on hard problems,” he says. “The physics is hard, the engineering is hard, and it’s all interrelated. Not everybody can do this kind of work, but it gets in your blood, it really does.”</p> 
          858    <p><span style="display:inline !important">In this photo, Armstrong, an IEEE Fellow, holds two of the devices he helped develop: a millimeter-wave mini traveling-wave tube and a microwave power module.&nbsp;</span> Beyond the ubiquitous magnetrons in microwave ovens and the traveling-wave tubes in communications satellites, he says, vacuum devices still find their way into a surprisingly broad array of applications where “you need high efficiency, high power, and wide amplification bandwidth.” Those applications include cancer therapy, fusion reactors, industrial heating, particle accelerators, radar, missile defense, and electronic warfare.</p> 
          859    <p>Almost all of the tubes in Armstrong’s article are ones he helped design or came across during his career, but he included one based on the recommendation of his son Derek, a musician. That’s the Telefunken VF14M, a specialized audio tube used in the much-revered Neumann U47 and U48 microphones. Over the decades, many recording artists have favored those mics, including Ella Fitzgerald, Frank Sinatra, and the Beatles.</p> 
          860    <p>“I’m a huge Beatles fan, so I was more than happy to include that one,” Armstrong says.</p> 
          861   </aside> 
          862  </div> 
          863 </div>]]></content:encoded>
          864       <dc:creator>Carter M. Armstrong</dc:creator>
          865       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMDY3Mw.jpeg" />
          866       <media:content url="https://spectrum.ieee.org/image/MzcxMDY3Mw.jpeg" />
          867     </item>
          868     <item>
          869       <title>Understanding Causality Is the Next Challenge for Machine Learning</title>
          870       <link>https://spectrum.ieee.org/tech-talk/artificial-intelligence/machine-learning/understanding-causality-is-the-next-challenge-for-machine-learning</link>
          871       <description>Teaching robots to understand "why" could help them transfer their knowledge to other environments</description>
          872       <category>artificial-intelligence</category>
          873       <category>artificial-intelligence/machine-learning</category>
          874       <pubDate>Thu, 29 Oct 2020 15:00:00 GMT</pubDate>
          875       <guid>https://spectrum.ieee.org/tech-talk/artificial-intelligence/machine-learning/understanding-causality-is-the-next-challenge-for-machine-learning</guid>
          876       <content:encoded><![CDATA[<p>“Causality is very important for the next steps of progress of machine learning,” said&nbsp;<a href="https://yoshuabengio.org/" target="_blank">Yoshua Bengio</a>, a Turing Award-wining scientist known for his work in deep learning, in an&nbsp;<a href="/tech-talk/artificial-intelligence/machine-learning/yoshua-bengio-revered-architect-of-ai-has-some-ideas-about-what-to-build-next" target="_blank">interview</a>&nbsp;with&nbsp;<em>IEEE Spectrum</em>&nbsp;in 2019. So far, deep learning has comprised learning from static datasets, which makes AI really good at tasks related to correlations and associations. However, neural nets do not interpret cause-and effect, or why these associations and correlations exist. Nor are they particularly good at tasks that involve imagination, reasoning, and planning. This, in turn, limits AI from being able to generalize their learning and transfer their skills to another related environment.</p> 
          877 <p>The lack of generalization is a big problem, says&nbsp;<a href="https://sites.google.com/view/causal-world/home" target="_blank">Ossama Ahmed</a>, a master’s student at <a href="https://ethz.ch/en.html">ETH Zurich</a> who has worked with Bengio’s team to develop a robotic benchmarking tool for causality and transfer learning. “Robots are [often] trained in simulation, and then when you try to deploy [them] in the real world…they usually fail to transfer their learned skills. One of the reasons is that the physical properties of the simulation are quite different from the real world,” says Ahmed. The group’s tool, called&nbsp;<a href="https://sites.google.com/view/causal-world/home" target="_blank">CausalWorld</a>, demonstrates that with some of the methods currently available, the generalization capabilities of robots aren’t good enough—at least not to the extent that “we can deploy [them] safely in any arbitrary situation in the real world,” says Ahmed.</p> 
          878 <p>The&nbsp;<a href="https://arxiv.org/abs/2010.04296" target="_blank">paper</a>&nbsp;on CausalWorld, available as a preprint, describes benchmarks in a simulated robotics manipulation environment using the open-source&nbsp;<a href="https://sites.google.com/view/trifinger" target="_blank">TriFinger robotics platform</a>. The main purpose of CausalWorld is to accelerate research in causal structure and transfer learning using this simulated environment, where learned skills could potentially be transferred to the real world. Robotic agents can be given tasks that comprise pushing, stacking, placing, and so on, informed by how children have been observed to play with blocks and learn to build complex structures. There is a large set of parameters, such as weight, shape, and appearance of the blocks and the robot itself, on which the user can intervene at any point to evaluate the robot’s generalization capabilities.</p> 
          879 <p>In their study, the researchers gave the robots a number of tasks ranging from simple to extremely challenging, based on three different curricula. The first involved no environment changes; the second had changes to a single variable; and the third allowed full randomization of all variables in the environment. They observed that as the curricula got more complex, the agents showed less ability to transfer their skills to the new conditions.</p> 
          880 <p>“If we continue scaling up training and network architectures beyond the experiments we report, current methods could potentially solve more of the block stacking environments we propose with CausalWorld,” points out&nbsp;<a href="https://ei.is.mpg.de/~ftraeuble" target="_blank">Frederik Träuble</a>, one of the contributors to the study. Träuble adds that “What’s actually interesting is that we humans can generalize much, much quicker [and] we don’t need such a vast amount of experience… We can learn from the underlying shared rules of [certain] environments…[and] use this to generalize better to yet other environments that we haven’t seen.”</p> 
          881 <p>A standard neural network, on the other hand, would require insane amounts of experience with myriad environments in order to do the same. “Having a model architecture or method that can learn these underlying rules or causal mechanisms, and utilize them could [help] overcome these challenges,” Träuble says.</p> 
          882 <p>CausalWorld’s evaluation protocols, say Ahmed and Träuble, are more versatile than those in previous studies because of the possibility of “disentangling” generalization abilities. In other words, users are free to intervene on a large number of variables in the environment, and thus draw systemic conclusions about what the agent generalizes to—or doesn’t. The next challenge, they say, is to actually use the tools available in CausalWorld to build more generalizable systems.</p> 
          883 <p>Despite how dazzled we are by AI’s ability to perform certain tasks, Yoshua Bengio, in 2019, estimated that present-day deep learning is less intelligent than a two-year-old child. Though the ability of neural networks to parallel-process on a large scale has given us breakthroughs in computer vision, translation, and memory, research is now shifting to developing novel deep architectures and training frameworks for addressing tasks like reasoning, planning, capturing causality, and obtaining systematic generalization. “I believe it’s just the beginning of a different style of brain-inspired computation,” Bengio said, adding, “I think we have a lot of the tools to get started.”</p> 
          884 <p></p>]]></content:encoded>
          885       <dc:creator>Payal Dhar</dc:creator>
          886       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMTAyNw.jpeg" />
          887       <media:content url="https://spectrum.ieee.org/image/MzcxMTAyNw.jpeg" />
          888     </item>
          889     <item>
          890       <title>Dart-Shooting Drone Attacks Trees for Science</title>
          891       <link>https://spectrum.ieee.org/automaton/robotics/drones/dart-shooting-drone</link>
          892       <description>Sensor-laden darts can collect data in hazardous or inaccessible environments</description>
          893       <category>robotics</category>
          894       <category>robotics/drones</category>
          895       <pubDate>Wed, 28 Oct 2020 22:00:00 GMT</pubDate>
          896       <guid>https://spectrum.ieee.org/automaton/robotics/drones/dart-shooting-drone</guid>
          897       <content:encoded><![CDATA[<link href="/ns/interactive/0118race-to-5g/css/5g-logo-treatments.css" rel="stylesheet"> 
          898 <div class="mobileHide"> 
          899  <div class="imgWrapper offsetLeft lt sm"> 
          900   <a href="/static/journal-watch"><img alt="Journal Watch report logo, link to report landing page" src="/image/MzI0MTAwOQ.jpeg"></a> 
          901  </div> 
          902 </div> 
          903 <p>We all know how robots are great at going to places where you can’t (or shouldn’t) send a human. We also know how robots are great at doing repetitive tasks. These characteristics have the potential to make robots ideal for setting up wireless sensor networks in hazardous environments—that is, they could deploy a whole bunch of self-contained sensor nodes that create a network that can monitor a very large area for a very long time.</p> 
          904 <p>When it comes to using drones to set up sensor networks, you’ve generally got two options: A drone that just drops sensors on the ground (easy but inaccurate and limited locations), or using a drone with some sort of manipulator on it to stick sensors in specific places (complicated and risky). A third option, under development by researchers at&nbsp;Imperial College London’s <a href="https://www.imperial.ac.uk/aerial-robotics">Aerial Robotics Lab</a>, provides the accuracy of direct contact with the safety and ease of use of passive dropping by instead using the drone as a launching platform for laser-aimed, sensor-equipped darts.&nbsp;</p> 
          905 <!--nextpage--> 
          906 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/duPRXCyo6cY?rel=0" width="620"></iframe></p> 
          907 <p>These darts (which the researchers refer to as aerodynamically stabilized, spine-equipped sensor pods) can embed themselves in relatively soft targets from up to 4 meters away with an accuracy of about&nbsp;10 centimeters&nbsp;after being fired from a spring-loaded launcher. They’re not quite as accurate as a drone with a manipulator, but it’s pretty good, and the drone can maintain a safe distance from the surface that it’s trying to add a sensor to. Obviously, the spine is only going to work on things like wood, but the researchers point out that there are plenty of attachment mechanisms that could be used, including magnets, adhesives, chemical bonding, or microspines.</p> 
          908 <p>Indoor tests using magnets showed the system to be quite reliable, but at close range (within a meter of the target) the darts sometimes bounced off rather than sticking. From between 1 and 4 meters away, the darts stuck between 90 and 100 percent of the time. Initial outdoor tests were also successful, although the system was under manual control. The researchers say that “regular and safe operations should be carried out autonomously,” which, yeah, you’d just have to deal with all of the extra sensing and hardware required to autonomously fly beneath the canopy of a forest. That’s happening next, as the researchers plan to add “vision state estimation and positioning, as well as a depth sensor” to avoid some trees and fire sensors into others.</p> 
          909 <p>And if all of that goes well, they’ll consider trying to get each drone to carry multiple darts. Look out, trees: You’re about to be pierced for science.</p> 
          910 <p>“Unmanned Aerial Sensor Placement for Cluttered Environments,” by André Farinha, Raphael Zufferey, Peter Zheng, Sophie F. Armanini, and Mirko Kovac from Imperial College London, was published in <em>IEEE Robotics and Automation Letters</em>.</p> 
          911 <div class="mobileShow"> 
          912  <h3 class="RptHdBackBarMobile"><span class="BackArrowBlkBkgrd">&lt;</span>&nbsp;<a href="/static/journal-watch">Back to IEEE Journal Watch</a></h3> 
          913 </div>]]></content:encoded>
          914       <dc:creator>Evan Ackerman</dc:creator>
          915       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMDgzNQ.jpeg" />
          916       <media:content url="https://spectrum.ieee.org/image/MzcxMDgzNQ.jpeg" />
          917     </item>
          918     <item>
          919       <title>IoT Network Companies Have Cracked Their Chicken-and-Egg Problem</title>
          920       <link>https://spectrum.ieee.org/telecom/wireless/iot-network-companies-have-cracked-their-chicken-and-egg-problem</link>
          921       <description>Which comes first, the network or the devices? Why not both?</description>
          922       <category>telecom</category>
          923       <category>telecom/wireless</category>
          924       <pubDate>Wed, 28 Oct 2020 19:00:00 GMT</pubDate>
          925       <guid>https://spectrum.ieee.org/telecom/wireless/iot-network-companies-have-cracked-their-chicken-and-egg-problem</guid>
          926       <content:encoded><![CDATA[<figure class="rt med" role="img"> 
          927  <img alt="Illustration of a figure looking at 3d pink squares." src="/image/MzcwMzgwMw.jpeg"> 
          928  <div class="ai"> 
          929   <figcaption class="hi-cap">
          930     Illustration: Greg Mably 
          931   </figcaption> 
          932  </div> 
          933 </figure> 
          934 <p><strong>Along with</strong> everything else going on, we may look back at 2020 as the year that companies finally hit upon a better business model for Internet of Things (IoT) networks. Established network companies such as <a href="https://www.thethingsnetwork.org/">the Things Network</a> and <a href="https://www.helium.com/">Helium</a>, and new players such as <a href="https://www.amazon.com/">Amazon</a>, have seemingly given up on the idea of making money from selling network connectivity. Instead, they’re focused on getting the network out there for developers to use, assuming in the process that they’ll benefit from the effort in the long run.</p> 
          935 <p>IoT networks have a chicken-and-egg problem. Until device makers see widely available networks, they don’t want to build products that run on the network. And customers don’t want to pay for a network, and thus, fund its development, if there aren’t devices available to use. So it’s hard to raise capital to build a new wide-area network (WAN) that provides significant coverage and supports a plethora of devices that are enticing to use.</p> 
          936 <p>It certainly didn’t help such network companies as <a href="https://www.ingenu.com/">Ingenu</a>, <a href="https://machineq.com/">MachineQ</a>, <a href="https://www.senetco.com/">Senet</a>, and <a href="https://www.sigfox.com/en">SigFox</a> that they’re all marketing half a dozen similar proprietary networks. Even the cellular carriers, which are promoting both LTE-M for machine-to-machine networks and NB-IoT for low-data-rate networks, have historically struggled to justify their investments in IoT network infrastructure. After COVID-19 started spreading in Japan, <a href="https://www.nttdocomo.co.jp/english/">NTT DoCoMo</a> called it quits on its NB-IoT network, citing a lack of demand.</p> 
          937 <p>“Personally, I don’t believe in business models for [low-power WANs],” says <a href="https://www.thethingsnetwork.org/u/wienkegiezeman">Wienke Giezeman</a>, the CEO and cofounder of the Things Network. His company does deploy long-range low-power WAN gateways for customers that use the <a href="https://lora-alliance.org/">LoRa Alliance</a>’s LoRaWAN specifications. (“LoRa” is short for “long-range.”) But Giezeman sees that as the necessary element for later providing the sensors and software that deliver the real IoT applications customers want to buy. Trying to sell both the network and the devices is like running a restaurant that makes diners buy and set up the stove before it cooks their meals.</p> 
          938 <p>The Things Network sets up the “stove” and includes the cost of operating it in the “meal.” But, because Giezeman is a big believer in the value of open source software and creating a sense of abundance around last-mile connectivity, he’s also asking customers to opt in to turning their networks into public networks.</p> 
          939 <p>Senet does something similar, letting customers share their networks. Helium is using cryptocurrencies to entice people to set up LoRaWAN hotspots on their networks and rewarding them with tokens for keeping the networks operational. When someone uses data from an individual’s Helium node, that individual also gets tokens that might be worth something one day. I actually run a Helium hotspot in my home, although I’m more interested in the LoRa coverage than the potential for wealth.</p> 
          940 <p>And there’s Amazon, which plans to embed its own version of a low-power WAN into its <a href="https://www.amazon.com/Amazon-Echo-And-Alexa-Devices/b/ref=amb_link_4?ie=UTF8&amp;node=9818047011&amp;pf_rd_m=ATVPDKIKX0DER&amp;pf_rd_s=merchandised-search-leftnav&amp;pf_rd_r=R3AVZGABKKQ7WEW2RT0K&amp;pf_rd_r=R3AVZGABKKQ7WEW2RT0K&amp;pf_rd_t=101&amp;pf_rd_p=a07256c5-b960-4333-87bd-b987dbb82542&amp;pf_rd_p=a07256c5-b960-4333-87bd-b987dbb82542&amp;pf_rd_i=9818047011">Echo</a> and its <a href="https://ring.com/">Ring</a> security devices. Whenever someone buys one of these devices they’ll have the option of adding it as a node on the <a href="https://blog.aboutamazon.com/devices/introducing-amazon-sidewalk">Amazon Sidewalk</a> network. Amazon’s plan is to build out a decentralized network for IoT devices, starting with a deal to let manufacturer <a href="https://www.thetileapp.com/en-us/about-tile">Tile</a> use the network for its ­<a href="https://www.bluetooth.com/">Bluetooth</a> tracking devices.</p> 
          941 <p>After almost a decade of following various low-power IoT networks, I’m excited to see them abandon the idea that the network is the big value, and instead recognize that it’s the things on the network that entice people. Let’s hope this year marks a turning point for low-power WANs.</p> 
          942 <p><em>This article appears in the November 2020 print issue as “Network Included.”</em></p>]]></content:encoded>
          943       <dc:creator>Stacey Higginbotham</dc:creator>
          944       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwMzk1OA.jpeg" />
          945       <media:content url="https://spectrum.ieee.org/image/MzcwMzk1OA.jpeg" />
          946     </item>
          947     <item>
          948       <title>World’s First Ocean Hybrid Platform Converts Tidal Waves Into Energy</title>
          949       <link>https://spectrum.ieee.org/news-from-around-ieee/the-institute/ieee-member-news/worlds-first-ocean-hybrid-platform-converts-tidal-waves-into-energy</link>
          950       <description>Built by Sinn Power, the technology uses a combination of wave, wind, and solar</description>
          951       <category>the-institute</category>
          952       <category>the-institute/ieee-member-news</category>
          953       <pubDate>Wed, 28 Oct 2020 18:00:00 GMT</pubDate>
          954       <guid>https://spectrum.ieee.org/news-from-around-ieee/the-institute/ieee-member-news/worlds-first-ocean-hybrid-platform-converts-tidal-waves-into-energy</guid>
          955       <content:encoded><![CDATA[<style type="text/css">.entry-content .tisubhead {
          956     color: #999999;
          957     font-family: verdana;
          958     font-size: 14px;
          959     font-weight: bold;
          960     letter-spacing: 1px;
          961     margin-bottom: -5px !important;
          962     text-transform: uppercase;
          963 }
          964 
          965 .tiblogopener {
          966     color: #a17e54;
          967    font-family: Theinhardt-Medium, sans-serif;
          968   letter-spacing: 1px;
          969   margin-right: 10px;
          970     font-weight: bold;
          971     text-transform: uppercase;
          972 }
          973 </style> 
          974 <p><span class="tiblogopener">THE INSTITUTE </span>Energy captured from tidal motion, waves, and currents can be used to produce electricity, providing power to millions of homes in the coming decades. Unlike other renewable energy sources, waves are easily forecasted and available 24/7. There is a tremendous amount of energy in the ocean. Water covers about 70&nbsp;percent of our planet, and because it is 830&nbsp;times denser than air, it can carry much more energy than wind per volume.</p> 
          975 <p>But despite the energy source’s great potential, it remains untapped.</p> 
          976 <p>Engineers have been trying to invent machines to generate electricity from water since the 18th&nbsp;century. In 1799 French engineer Pierre Girard and his son, Pierre-Simon, were granted a <a href="http://www.sciencedirect.com/science/article/pii/S1364032102000096?via%25253Dihub">patent</a> on the use of energy from ocean waves. They designed a machine to capture the energy in sea waves to power heavy machinery including mills and pumps. By attaching heavy wooden beams to docked battleships and taking advantage of the vessels’ bobbing to operate the beams as levers against fulcrums on shore, the Parisian inventors were able to operate pumps, sawmills, and other machines.</p> 
          977 <p><a href="http://www.sciencedirect.com/science/article/pii/S1364032102000096?via%25253Dihub">In 1910 French engineer M.&nbsp;Bochaux-Praceique built a device that likely was the first </a><a href="https://asmedigitalcollection.asme.org/offshoremechanics/article-abstract/129/4/273/458371/An-Investigation-Into-the-Hydrodynamic-Efficiency?redirectedFrom=fulltext">oscillating water column</a><em> </em>to generate electricity from sea waves.</p> 
          978 <p>Between 1940 and 1950, Yoshio Masuda, a former Japanese naval commander, developed a navigation buoy powered by wave energy. It was equipped with an air turbine. Many people regard Masuda as the father of modern wave-energy-conversion technology.</p> 
          979 <p>Since the 1950s, many inventors have come up with commercial-scale wave-energy designs, but few have worked. The ocean, with challenges that include corrosive water and unpredictable winds, makes things difficult.</p> 
          980 <p>The 1970s oil crisis was a turning point for the industry. That’s when experts started looking for alternative energy sources and reconsidered the ocean. It took until the 1990s, though, for actual research and development to start. The number of announced ocean-energy patents between 2009 and 2013 was 150.</p> 
          981 <p>Ocean-energy projects currently span the world, with activities in Australia, Canada, France, Japan, Korea, the United Kingdom and the United States. The key players were universities and startups until the recent entrance of bigger players. Now multinationals <a href="https://global.abb/group/en/technology/ventures">ABB Technology Ventures</a>, <a href="https://www.lockheedmartin.com/en-us/index.html">Lockheed Martin</a>, <a href="https://www.mhi.com/node/12">Mitsubishi Heavy Industries</a>, <a href="https://www.mes.co.jp/english/">Mitsui Engineering and Shipbuilding</a>, and <a href="https://www.naval-group.com/en/">Naval Group</a> have interest in the sector.</p> 
          982 <p>Some utilities are committed to the concept, such as Finnish energy company <a href="https://www.fortum.com/">Fortum</a>, Spanish utility <a href="https://www.iberdrola.com/home">Iberdrola</a>, French utility <a href="https://www.edf.fr/en/the-edf-group">EDF</a>, and Swedish energy giant <a href="https://group.vattenfall.com/">Vattenfall</a>.</p> 
          983 <p>Europe is at the forefront of the industry, with about half the world’s ocean-energy developers.</p> 
          984 <p>To learn how engineers and scientists are developing solutions, I spoke with experts in Germany who are leading a team to commercially deliver energy to customers in different parts of the world.</p> 
          985 <h3 class="tisubhead">SINN POWER</h3> 
          986 <p>In August I spoke with Philipp Sinn, founder of <a href="https://www.sinnpower.com/">Sinn Power</a>, a German green-energy startup founded in 2014. This year he and his colleagues began building and testing the world’s first ocean hybrid platform.</p> 
          987 <p>The floating platform uses a combination of wave, wind, and solar energy to harness renewable energy on the open seas, Sinn says. The company has been testing the structure, which has attracted investors, energy experts, scientists, and government officials from all over the world to Heraklion, the largest city on the Greek Island of Crete.</p> 
          988 <p>The wind, wave, and photovoltaic platform is scalable in capacity and can be designed to generate 80&nbsp;kilowatts to power small houses by the coast and up to 2&nbsp;megawatts to industrial buildings, Sinn says. The technology can be adapted to customers’ needs and location requirements, he adds.</p> 
          989 <p>He acknowledges that the maritime environment is challenging. All the energy systems on the platform contain sensitive components and power electronics that must not be exposed to any fluids, he says.</p> 
          990 <p>To cope with such conditions, the company developed a product family consisting of electric machines, power electronics, and storage solutions, all of which comply with International Protection Code&nbsp;68, which classifies and rates degrees of protection provided by mechanical casings and electrical enclosures against intrusion, dust, accidental contact, and immersion in deep water.</p> 
          991 <p>“We see [our company’s] technologies as a movement toward a sustainable future,” Sinn says. “The goal is to provide people all over the world with clean, reliable, and affordable energy harnessed from the power of the ocean.”</p> 
          992 <h3 class="tisubhead">A BRIGHT FUTURE</h3> 
          993 <p>Ocean energy is an essential step in achieving our global climate and sustainable-development objectives.</p> 
          994 <p>The global market for ocean energy is expected to reach 22&nbsp;million&nbsp;kW by 2025.</p> 
          995 <p>Development of ocean-energy production—from concept to commercial release—has been a slow, expensive process. For the industry to succeed, it is essential to get financial support from governments all over the world. It is also important to strengthen the cooperation between countries, especially with regard to joint projects and the exchange of technology.</p> 
          996 <p><em>IEEE Senior Member </em><a href="https://www.linkedin.com/in/qusi-alqarqaz/"><span lang="PT">Qusi Alqarqaz</span></a><em> is an electrical engineer with more than 28&nbsp;years of experience in the power industry. He is a contributor to </em><a href="http://theinstitute.ieee.org/search?q=qusi">The Institute</a> <em>and serves on its editorial advisory board.</em></p>]]></content:encoded>
          997       <dc:creator>Qusi Alqarqaz</dc:creator>
          998       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwODI3Mg.jpeg" />
          999       <media:content url="https://spectrum.ieee.org/image/MzcwODI3Mg.jpeg" />
         1000     </item>
         1001     <item>
         1002       <title>New Stent-like Electrode Allows Humans to Operate Computers With Their Thoughts</title>
         1003       <link>https://spectrum.ieee.org/the-human-os/biomedical/bionics/new-stent-like-electrode-allows-humans-to-operate-computers-with-their-thoughts</link>
         1004       <description>First humans to receive a "stentrode" demonstrate that home brain-computer interface systems are feasible</description>
         1005       <category>biomedical</category>
         1006       <category>biomedical/bionics</category>
         1007       <pubDate>Wed, 28 Oct 2020 11:00:00 GMT</pubDate>
         1008       <guid>https://spectrum.ieee.org/the-human-os/biomedical/bionics/new-stent-like-electrode-allows-humans-to-operate-computers-with-their-thoughts</guid>
         1009       <content:encoded><![CDATA[<p>Two Australian men with&nbsp;neuromuscular&nbsp;disorders&nbsp;regained some personal independence after researchers implanted&nbsp;stent-like electrodes&nbsp;in their brains, allowing&nbsp;them to&nbsp;<span>operate&nbsp;computers using their thoughts</span>.&nbsp;</p> 
         1010 <p>This is the first time such a device, dubbed a “stentrode,”&nbsp;has been implanted in humans, according to its inventors. The system also makes real-world use of brain-computer interfaces (BCIs)—devices that enable direct communication between the brain and a computer—more feasible.</p> 
         1011 <p><span>The feat was <a href="https://jnis.bmj.com/content/early/2020/10/25/neurintsurg-2020-016862">described today in the&nbsp;</a></span><a href="https://jnis.bmj.com/content/early/2020/10/25/neurintsurg-2020-016862"><em>Journal of Neurointerventional Surgery</em></a><span>. “This paper represents the first fully implantable, commercial BCI&nbsp;</span><span>system that patients can take home and use,” says <a href="https://www.synchronmed.com/team">Tom Oxley</a>, founder of Melbourne-based&nbsp;<a href="https://www.synchronmed.com/">Synchron</a>, which developed the device.</span></p> 
         1012 <!--nextpage--> 
         1013 <p><span>Researchers have been experimenting for over 15 years with&nbsp;sensing&nbsp;human brain activity and converting it directly into computer commands.&nbsp;Most of these systems involve&nbsp;open brain surgery&nbsp;and leave&nbsp;hardware protruding from the head so that the systems can be studied&nbsp;in&nbsp;labs. It’s highly experimental, and only a handful of&nbsp;people&nbsp;have had it done.</span></p> 
         1014 <p>The stentrode offers a less intrusive way to get electrodes to the brain. The device is squeezed into a catheter and&nbsp;placed in the jugular vein<span>&nbsp;in the neck. From there the catheter&nbsp;snakes up through blood vessels until it reaches&nbsp;the motor cortex of the brain. Then it&nbsp;releases the stentrode, which holds 16 electrode contacts and springs out into a tube-like scaffold that fits against the walls of a blood vessel in that section of the brain.&nbsp;&nbsp;</span></p> 
         1015 <p>The stentrode is connected by a lead down to a device that is surgically&nbsp;implanted in the chest. The device provides power and data transmission. An&nbsp;external device interprets the signals from the brain using machine learning algorithms&nbsp;and converts them to computer commands.&nbsp;</p> 
         1016 <p>The quality of the signals is sacrificed a little by the electrodes being in a blood vessel, rather than directly on brain tissue. But it’s good enough to allow the two participants in the study to accurately type up to 20 characters per minute with predictive text disabled, and do online shopping and banking, all without lifting a finger or using voice commands.&nbsp;</p> 
         1017 <figure class="xlrg" role="img"> 
         1018  <img alt="Graham Felsted has a stentrode implant" src="/image/MzcxMDM4NQ.jpeg"> 
         1019  <figcaption class="hi-cap">
         1020    Photo:&nbsp;Synchron 
         1021  </figcaption> 
         1022  <figcaption>
         1023    Graham Felsted has a stentrode implant. He is using the BCI to write his book, Technopathy for Beginners. 
         1024  </figcaption> 
         1025 </figure> 
         1026 <p>The two participants who received the stentrodes suffer from amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease. The condition causes loss of muscle control and paralysis. Being able to remotely contact his wife and to be productive on a computer “has been life-altering,” says&nbsp;Graham Felstead, one of the two men with&nbsp;stentrode implants.</p> 
         1027 <p>Operating the system is painstaking. Felstead controls the computer by thinking about squeezing his right leg. These thoughts translate into commands such as left click or right click or zoom. A separate&nbsp;eye tracker allows him to move the cursor with his eye movements.&nbsp;The first thing Felstead typed was “We’re going to need more coffee.”&nbsp;Felstead has been using the device on his own for over a year now.</p> 
         1028 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/c5PEiXZFITA" width="620"></iframe></p> 
         1029 <p>Being able to use a BCI system at home, outside the lab, is&nbsp;a huge leap for this budding research sector. <span>The&nbsp;</span><a href="https://www.braingate.org/" style="box-sizing: border-box; color: rgb(0, 0, 0); background-color: rgb(255, 255, 255); font-family: Georgia, serif; font-size: 18px;">BrainGate</a><span>&nbsp;consortium and&nbsp;other groups&nbsp;have used BCI to enable&nbsp;people with neurologic diseases and paralysis to&nbsp;</span><a href="/the-human-os/biomedical/devices/paralyzed-individuals-operate-tablet-with-brain-implant" style="box-sizing: border-box; color: rgb(0, 0, 0); background-color: rgb(255, 255, 255); font-family: Georgia, serif; font-size: 18px;">operate tablets</a><span>,&nbsp;</span><a href="/the-human-os/biomedical/bionics/new-record-for-typing-by-brain-paralyzed-man-uses-brain-implant-to-type-8-words-per-minute" style="box-sizing: border-box; color: rgb(0, 0, 0); background-color: rgb(255, 255, 255); font-family: Georgia, serif; font-size: 18px;">type eight words per minute</a><span>&nbsp;and control&nbsp;</span><a href="/biomedical/bionics/a-better-way-for-brains-to-control-robotic-arms" style="box-sizing: border-box; color: rgb(0, 0, 0); background-color: rgb(255, 255, 255); font-family: Georgia, serif; font-size: 18px;">prosthetic limbs</a><span>&nbsp;using only their thoughts. But these systems involve hardware protruding from the skull and are relegated to a laboratory setting.&nbsp;</span></p> 
         1030 <p>Elon Musk in August <a href="/the-human-os/biomedical/devices/elon-musk-neuralink-advance-brains-ai">said that his&nbsp;<span>company,&nbsp;</span>Neuralink</a>,<span><a href="/the-human-os/biomedical/devices/elon-musk-neuralink-advance-brains-ai">&nbsp;had built a self-contained neural implant</a>&nbsp;that can wirelessly transmit detailed brain activity&nbsp;without the aid of external hardware. But Neuralink so far has only demonstrated the device in pigs.&nbsp;</span></p> 
         1031 <p>“It’s all hypothetical until we put the device in a human,” says Oxley. One of Synchron’s next moves is to seek permission from&nbsp;the U.S. Food and Drug Administration for permission to implant the device in people in the U.S.</p> 
         1032 <p></p> 
         1033 <p></p>]]></content:encoded>
         1034       <dc:creator>Emily Waltz</dc:creator>
         1035       <media:thumbnail url="https://spectrum.ieee.org/image/MzcxMDM1NA.jpeg" />
         1036       <media:content url="https://spectrum.ieee.org/image/MzcxMDM1NA.jpeg" />
         1037     </item>
         1038     <item>
         1039       <title>The Tuatara Is The World's Fastest Production Car</title>
         1040       <link>https://spectrum.ieee.org/cars-that-think/transportation/advanced-cars/the-tuatara-is-the-worlds-fastest-production-car</link>
         1041       <description>The key was the lapidary engine, together with a passel of sensors and diagnostics</description>
         1042       <category>transportation</category>
         1043       <category>transportation/advanced-cars</category>
         1044       <pubDate>Tue, 27 Oct 2020 19:30:00 GMT</pubDate>
         1045       <guid>https://spectrum.ieee.org/cars-that-think/transportation/advanced-cars/the-tuatara-is-the-worlds-fastest-production-car</guid>
         1046       <content:encoded><![CDATA[<p>When Jerod&nbsp;Shelby looked to capture evidence of his SSC Tuatara gunning to new world speed record for production cars—on a public road in Nevada—an iPhone wouldn’t cut it. He had to requisition a T33 jet as the “camera vehicle” to keep pace with his 1,750-horsepower hypercar.</p> 
         1047 <p></p> 
         1048 <p>Between that subsonic jet’s gyro-stabilized nose camera, two Guinness record observers, and GPS data from about 15 satellites, the evidence is in: On 10 October, on a 11-kilometer&nbsp;stretch of State Route 160, between Las Vegas and Pahrump, the Tuatara time-warped to an insane top speed of <span>&nbsp;533 kph—that’s&nbsp;</span>331.15 mph. That blew away the Koenigsegg Agera RS’s one-way high of 458 kph, achieved on the same closed-down desert highway in 2017. The Tuatara’s two-direction average of 508 kph—the key record, in accordance with rules to account for wind and road-grade changes—easily topped the Bugatti Chiron’s 490-kph pace on Volkswagen’s Ehra-Lessien test track in Germany.</p> 
         1049 <p>It’s all part of what Shelby dubs “the land-based space race.” If you’re thinking there’s a little competition going on, you’d be right.</p> 
         1050 <figure class="xlrg" role="img"> 
         1051  <img alt="Jarod Shelby’s 24-employee company has topped the biggest names in speed, including Bugatti and McLaren." src="/image/MzcwOTkxNA.jpeg"> 
         1052  <figcaption class="hi-cap">
         1053    Photo: SSC North America 
         1054  </figcaption> 
         1055  <figcaption>
         1056    Jerod&nbsp;Shelby’s 24-employee company has topped the biggest names in speed, including Bugatti and McLaren. 
         1057  </figcaption> 
         1058 </figure> 
         1059 <p>The Tuatara represents a decade of development for Shelby, a former go-kart racer and mechanical engineer who founded Shelby Supercars (now SSC North America) in his Richland, Washington hometown in 1998. Shelby—no relation to Carroll Shelby of <em>Ford v. Ferrari </em>fame—first captured the auto world’s attention with his Ultimate Aero, which set its own record in 2007 by nipping 412 kph.</p> 
         1060 <p></p> 
         1061 <p></p> 
         1062 <p></p> 
         1063 <p>“What’s crazy to me is how technology advances,” Shelby says. “When we broke the world record in 2007, it was so difficult to design a car to surpass the McLaren. Who would have thought that 13 years later, we’d go 75-mph faster, in a car that you could drive to dinner with your wife and valet?”</p> 
         1064 <p></p> 
         1065 <p>Valet? If you can afford the Tuatara’s US $1.9-million price, you can afford Jeeves, too.</p> 
         1066 <p></p> 
         1067 <p>The Tuatara’s fast food includes a twin-turbo, 5.9-liter, all-alloy V-8 created by Nelson Racing Engines, the Chatsworth, Calif. company that builds specialized powerplants for racing and the street. The clean-sheet design features a flat-plane crankshaft and a lofty 8,800-rpm redline.</p> 
         1068 <p></p> 
         1069 <figure class="xlrg" role="img"> 
         1070  <img alt="The Tautara’s beating heart is a mid-mounted, twin-turbo V-8 that makes 1,750 horsepower on E85 gasoline-ethanol fuel." src="/image/MzcwOTkxNQ.jpeg"> 
         1071  <figcaption class="hi-cap">
         1072    Photo: SSC North America 
         1073  </figcaption> 
         1074  <figcaption>
         1075    The Tautara’s beating heart is a mid-mounted, twin-turbo V-8 that makes 1,750 horsepower on E85 gasoline-ethanol fuel. 
         1076  </figcaption> 
         1077 </figure> 
         1078 <p>Tom Nelson, the company’s founder, says the bespoke engine sounds… glorious.</p> 
         1079 <p></p> 
         1080 <p>“It’s like a mechanical symphony to me,” Nelson says. “And at the top end is where I really love the sound.”</p> 
         1081 <p></p> 
         1082 <figure class="xlrg" role="img"> 
         1083  <img alt="29-year-old British racer Oliver Webb piloted the Tuatara through tricky crosswind to set the speed record." src="/image/MzcwOTkxNg.jpeg"> 
         1084  <figcaption class="hi-cap">
         1085    Photo: SSC North America 
         1086  </figcaption> 
         1087  <figcaption>
         1088    29-year-old British racer Oliver Webb piloted the Tuatara through tricky crosswind to set the speed record. 
         1089  </figcaption> 
         1090 </figure> 
         1091 <p>No detail was left to chance for the record run by Oliver Webb, the 29-year-old British pro whose resume includes a European LeMans Series championship. For one, unlike a Bugatti, backed by the global might of the VW Group, Shelby couldn’t blow up his roughly $150,000 engine and casually drop in another. Its single engine not only survived a year of rigorous testing and development, but helped set the speed record on its first attempt. For buyers of the seven-figure beast — even if most will never see even 240 kph (150&nbsp;mph)—durability is a rightful concern. Shelby plans to build 100 copies of the Tuatara, aptly named after a <a href="https://www.nature.com/news/2008/080327/full/news.2008.695.html">New Zealand reptile</a> whose DNA is among the fastest-evolving of any vertebrate.</p> 
         1092 <p></p> 
         1093 <figure class="rt med" role="img"> 
         1094  <img alt="The bespoke, flat-crank V-8 spins to 8,800-rpm, with titanium and Inconel components." src="/image/MzcwOTkyMg.jpeg"> 
         1095  <figcaption class="hi-cap">
         1096    Photo: Nelson Racing Engines 
         1097  </figcaption> 
         1098  <figcaption>
         1099    The bespoke, flat-crank V-8 spins to 8,800-rpm, with titanium and Inconel components. 
         1100  </figcaption> 
         1101 </figure> 
         1102 <p>The engine’s turbine wheels, exhaust valves and 3D-printed exhaust collectors are formed from Inconel, the nickel-chromium-based super alloy that’s used in Formula One racing. Connecting rods are a special grade of titanium. Machined-gold pins connect wiring harnesses, ensuring they don’t loosen or corrode. Rotating masses are trimmed by 25 percent versus even Nelson’s typical engines, quelling the second-order vibrations large, flat-crank engines are prone to have. An army of sensors, including 11 for exhaust temperatures alone, will put the Tuatara in a protective limp mode if any operating parameter exceeds tolerances by 15 percent. The engine’s jewel-like build shined through when Nelson and his team ran the engine well beyond its normal range on a test balancer.</p> 
         1103 <p></p> 
         1104 <p>“You can spin this crankshaft to 10,000-rpm, put a wine glass on the balancer, and it won’t move,” Nelson says.</p> 
         1105 <p></p> 
         1106 <p>The computerized oversight came in handy in a week of “low-speed,” roughly 400-kph preliminary tests in Washington state, when the Tuatara’s exhaust-gas temperatures shot as high as 1, 077 degrees Celsius (1,970 degrees Fahrenheit).</p> 
         1107 <p></p> 
         1108 <p>“We couldn’t figure out what was going on, and we were going to abort,” Nelson says.</p> 
         1109 <p></p> 
         1110 <p>The team theorized that fuel ignition was lagging because the ignition coils weren’t going to ground quickly enough. A new resistor in the ignition system did the trick.</p> 
         1111 <p></p> 
         1112 <p>“It was like magic, the exhaust temperatures dropped 300 degrees,” Nelson says.</p> 
         1113 <p></p> 
         1114 <figure class="xlrg" role="img"> 
         1115  <img alt="The slippery body, by a former Ferrari designer, claims class-leading coefficient of drag." src="/image/MzcwOTkyMw.jpeg"> 
         1116  <figcaption class="hi-cap">
         1117    Photo: SSC North America 
         1118  </figcaption> 
         1119  <figcaption>
         1120    The slippery body, by a former Ferrari designer, claims class-leading coefficient of drag. 
         1121  </figcaption> 
         1122 </figure> 
         1123 <p>All the power in the world won’t help if a hypercar can’t overcome monstrous aerodynamic drag as it approaches 500 kph. Jason Castriota, the former Pininfarina designer responsible for several Ferrari and Maserati models, penned the rear-drive Tuatara’s slippery shape. That includes a class-best coefficient of drag of 0.279. That compares with a relatively truck-like .340 for the $1 million McLaren P1, one of history’s fastest showroom cars. A circulatory system of air channels directing air in and out of the body helps the Tuatara defeat drag, keeps systems cool and generates better than 360 kilograms (800 pounds) of downforce at V-Max. That keeps the car pinned to the ground while maintaining its ideal balance, with 63 percent of aero pressure over rear wheels, from 240 kph to its apogee, at which point the Tuatara was covering 1.5 football fields per second.</p> 
         1124 <p></p> 
         1125 <p>Stability became more critical after dawn on 10 Oct., as a slight but perilous crosswind made for a hair-raising record attempt. Forecasts called for winds to soon top the 10-mph safety limit. The mildest ripple beyond that, and Webb would be taking his young life in his hands. Webb’s wife was home in Los Angeles, seven months pregnant and too anxious to watch.</p> 
         1126 <p></p> 
         1127 <p>With nearly 200 people on site, including a film crew, Webb grew concerned about mounting pressures and expectations. Reality was intruding, quickly: This was a two-lane thread of asphalt through the Mojave, traversed daily by tourists dreaming of a Vegas payday, not a wide-open airport tarmac or racetrack with runoff and barriers for safety.</p> 
         1128 <p></p> 
         1129 <p>“He said, ‘I’m only going to do one more run; I’m willing to go one more time and give it my all.’” Shelby recalls.</p> 
         1130 <p></p> 
         1131 <figure class="xlrg" role="img"> 
         1132  <img alt="SSC plans to build 100 copies of the record-setting Tuatara, priced from $1.9 million." src="/image/MzcwOTkyNA.jpeg"> 
         1133  <figcaption class="hi-cap">
         1134    Photo: SSC North America 
         1135  </figcaption> 
         1136  <figcaption>
         1137    SSC plans to build 100 copies of the record-setting Tuatara, priced from $1.9 million. 
         1138  </figcaption> 
         1139 </figure> 
         1140 <p>Webb blasted off, focusing eyes as far into the distance as possible, the dotted-white line going solid in his vision as the howling Tuatara punched a hole through the air. Shelby and his two sons jumped into a rental van and raced to the end of the course. They encountered Webb, overcome with emotion, and figured the record bid had failed.</p> 
         1141 <p></p> 
         1142 <p>“Oliver was sitting on the ground when we pulled up, head in hands, and it didn’t look good,” Shelby says. “He said, ‘I’m done, I’m never doing this again; the wind pushed me right onto the shoulders.”</p> 
         1143 <p></p> 
         1144 <p>“But all of a sudden he looked up, smiled, and said, “I saw a really big number on the display, but I had to look away to save the car.”</p> 
         1145 <p></p> 
         1146 <p>Pulling the data, Shelby saw that big, big number: 331.15 mph. Webb stretched out on his back on the highway, exultant.</p> 
         1147 <p></p> 
         1148 <figure class="xlrg" role="img"> 
         1149  <img alt="Shelby and Webb set a new high in the “land-based space race.”" src="/image/MzcwOTkyNQ.jpeg"> 
         1150  <figcaption class="hi-cap">
         1151    Photo: SSC North America 
         1152  </figcaption> 
         1153  <figcaption>
         1154    Shelby and Webb set a new high in the “land-based space race.” 
         1155  </figcaption> 
         1156 </figure> 
         1157 <p>“He was the fastest man on the planet,” Shelby says of Webb. “And this isn’t the finish line. We’re ready to scale up, and look forward to working with customers.”</p> 
         1158 <p></p> 
         1159 <p>The land-speed battle isn’t settled, either. Along with Sweden’s Koenigsegg, Texas’ John Hennessey may be gunning to top SSC’s breakneck pace. They’d better ante up. Nelson says the car was running about 200 horses below its ultimate capacity. Webb, in the swashbuckling style of a born racer, says the Tuatara has more to give.</p> 
         1160 <p></p> 
         1161 <p>“With better conditions, I know we could have gone faster,” Webb says. “As I approached 331 mph, the Tuatara climbed almost 20 mph within the last five seconds, and it was still pulling well. The crosswinds are all that prevented us from realizing the car’s limit.”</p> 
         1162 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/CooK8mP3BOs?rel=0" width="620"></iframe></p>]]></content:encoded>
         1163       <dc:creator>Lawrence Ulrich</dc:creator>
         1164       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwOTg3Mg.jpeg" />
         1165       <media:content url="https://spectrum.ieee.org/image/MzcwOTg3Mg.jpeg" />
         1166     </item>
         1167     <item>
         1168       <title>New AI Inferencing Records</title>
         1169       <link>https://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/new-ai-inferencing-records</link>
         1170       <description>Nvidia tops MLPerf records again, consortium adds benchmarks to measure mobile</description>
         1171       <category>robotics</category>
         1172       <category>robotics/artificial-intelligence</category>
         1173       <pubDate>Tue, 27 Oct 2020 18:00:00 GMT</pubDate>
         1174       <guid>https://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/new-ai-inferencing-records</guid>
         1175       <content:encoded><![CDATA[<p><a href="https://mlperf.org/">MLPerf</a>, a consortium of AI experts and computing companies, has released a new set of <a href="https://mlperf.org/press/">machine learning records</a>. The records were set on a series of <a href="https://mlperf.org/inference-overview">benchmarks</a> that measure the speed of inferencing: how quickly an already-trained neural network can accomplish its task with new data. For the first time, benchmarks for mobiles and tablets were contested. According to <a href="https://www.linkedin.com/in/kanterd/">David Kanter</a>, executive director of MLPerf’s parent organization, a downloadable app is in the works that will allow anyone to test the <a href="/artificial-intelligence">AI</a> capabilities of their own smartphone or tablet.</p> 
         1176 <p>MLPerf’s goal is to present a fair and straightforward way to compare AI systems. Twenty-three organizations—including <a href="https://www.dell.com/en-us">Dell</a>, <a href="https://www.intel.com/content/www/us/en/homepage.html">Intel</a>, and <a href="https://www.nvidia.com/en-us/">Nvidia</a>—submitted a total of 1200 results, which were peer reviewed and subjected to random third-party audits. (Google was conspicuously absent this round.) As with the <a href="/tech-talk/artificial-intelligence/machine-learning/new-records-for-ai-training">MLPerf records for training AIs</a> released over the summer, Nvidia was the dominant force, besting what competition there was in all six categories for both <a href="https://mlperf.org/inference-results-0-7/">datacenter</a> and <a href="https://mlperf.org/inference-results-0-7/">edge</a> computing systems. Including submissions by partners like Cisco and Fujitsu, 1029 results, or 85 percent of the total for edge and data center categories, used Nvidia chips, according to the company.</p> 
         1177 <p>“Nvidia outperforms by a wide range on every test,” says Paresh Kharaya, senior director of product management, accelerated computing at Nvidia. Nvidia’s <a href="https://www.nvidia.com/en-us/data-center/a100/">A100 GPUs</a> powered its wins in the datacenter categories, while its <a href="https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-agx-xavier/">Xavier</a> was behind the GPU-maker’s edge-computing victories. According to Kharaya, on one of the new MLPerf benchmarks, Deep Learning Recommendation Model (DLRM), a single <a href="https://www.nvidia.com/en-us/data-center/dgx-a100/">DGX A100</a> system was the equivalent of 1000 CPU-based servers.</p> 
         1178 <p>There were four new inferencing benchmarks introduced this year, adding to the two carried over from the <a href="https://mlperf.org/inference-results-0-5">previous round</a>:</p> 
         1179 <ul> 
         1180  <li>BERT, for Bi-directional Encoder Representation from Transformers, is a natural language processing AI contributed by Google. Given a question input, BERT predicts a suitable answer.</li> 
         1181  <li>DLRM, for Deep Learning Recommendation Model is a recommender system that is trained to optimize click-through rates. It’s used to recommend items for online shopping and rank search results and social media content. Facebook was the major contributor of the DLRM code.</li> 
         1182  <li>3D U-Net is used in medical imaging systems to tell which 3D voxel in an MRI scan are parts of a tumor and which are healthy tissue. It’s trained on a dataset of brain tumors.</li> 
         1183  <li>RNN-T, for Recurrent Neural Network Transducer, is a speech recognition model. Given a sequence of speech input, it predicts the corresponding text.</li> 
         1184 </ul> 
         1185 <p>In addition to those new metrics, MLPerf put together the first set of benchmarks for mobile devices, which were used to test <a href="https://mlperf.org/inference-results-0-7/">smartphone and tablet platforms</a> from MediaTek, Qualcomm, and Samsung as well as a <a href="https://mlperf.org/inference-results-0-7/">notebook</a> from Intel. The new benchmarks included:</p> 
         1186 <ul> 
         1187  <li>MobileNetEdgeTPU, an image classification benchmark that is considered the most ubiquitous task in computer vision. It’s representative of how a photo app might be able pick out the faces of you or your friends.</li> 
         1188  <li>SSD-MobileNetV2, for Single Shot multibox Detection with MobileNetv2, is trained to detect 80 different object categories in input frames with 300x300 resolution. It’s commonly used to identify and track people and objects in photography and live video.</li> 
         1189  <li>DeepLabv3+ MobileNetV2: This is used to understand a scene for things like VR and navigation, and it plays a role in computational photography apps.&nbsp;</li> 
         1190  <li>MobileBERT is a mobile-optimized variant of the larger natural language processing BERT model that is fine-tuned for question answering. Given a question input, the MobileBERT generates an answer.</li> 
         1191 </ul> 
         1192 <figure class="xlrg" role="img"> 
         1193  <a class="zoom" href="/image/MzcwODQ2Nw.jpeg" rel="lightbox"><img alt="Nvidia's A100 swept the board in AI inferencing tasks where the data was available all at once (offline) or delivered as it would be in online (server)." src="/image/MzcwODQ2Nw.jpeg"><span class="magnifier">&nbsp;</span></a> 
         1194  <div class="ai"> 
         1195   <figcaption class="hi-cap">
         1196     Image: NVIDIA 
         1197   </figcaption> 
         1198   <figcaption>
         1199     Nvidia’s A100 swept the board in AI inferencing tasks where the data was available all at once (offline) or delivered as it would be in online (server). 
         1200   </figcaption> 
         1201  </div> 
         1202 </figure> 
         1203 <p>The benchmarks were run on a purpose-built app that should be available to everyone within months, according to Kanter. “We want something people can put into their hands for newer phones,” he says.</p> 
         1204 <p>The results released this week were dubbed version 0.7, as the consortium is still ramping up. Version 1.0 is likely to be complete in 2021.</p>]]></content:encoded>
         1205       <dc:creator>Samuel K. Moore</dc:creator>
         1206       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwODQwMg.jpeg" />
         1207       <media:content url="https://spectrum.ieee.org/image/MzcwODQwMg.jpeg" />
         1208     </item>
         1209     <item>
         1210       <title>Spotting Mystery Methane Leaks From Space
</title>
         1211       <link>https://spectrum.ieee.org/aerospace/satellites/spotting-mystery-methane-leaks-from-space</link>
         1212       <description>A fleet of microsatellites will identify emitters of the gas, which is responsible for a quarter of global warming
</description>
         1213       <category>aerospace</category>
         1214       <category>aerospace/satellites</category>
         1215       <pubDate>Tue, 27 Oct 2020 15:00:00 GMT</pubDate>
         1216       <guid>https://spectrum.ieee.org/aerospace/satellites/spotting-mystery-methane-leaks-from-space</guid>
         1217       <content:encoded><![CDATA[<link href="/ns/interactive/0118race-to-5g/css/5g-logo-treatments.css" rel="stylesheet"> 
         1218 <style type="text/css">.covidHeader {
         1219     color: #FFFFFF;
         1220   font-family: Theinhardt-Medium, sans-serif;
         1221     font-size: 12px;
         1222     font-weight: bold;
         1223     letter-spacing: 1px;
         1224     margin-bottom: -5px !important;
         1225     text-transform: uppercase;
         1226   text-align: center;
         1227 }.covidBdy {
         1228     font-size: 14px;
         1229   text-align: center;
         1230 }
         1231 .aside.inlay.rt.med .oilHR {
         1232   border-top: 1px dashed red;
         1233   }
         1234 </style> 
         1235 <figure class="xlrg" role="img"> 
         1236  <img alt="Photo of a satellite on a table.  " src="/image/MzcwNTk4Ng.jpeg"> 
         1237  <div class="ai"> 
         1238   <figcaption class="hi-cap">
         1239     Photo: GHGSat 
         1240   </figcaption> 
         1241   <figcaption> 
         1242    <strong>Now in Orbit: </strong>Our newest satellite, Iris, launched in September and underwent electromagnetic testing earlier this year. 
         1243   </figcaption> 
         1244  </div> 
         1245 </figure> 
         1246 <p><strong>Something new happened in space in January 2019.</strong> For the first time, a previously unknown leak of natural gas was spotted from orbit by a microsatellite, and then, because of that detection, plugged.</p> 
         1247 <p>The microsatellite, <a href="https://www.ghgsat.com/our-platforms/claire/">Claire</a>, had been flying since 2016. That day, Claire was monitoring the output of a mud volcano in Central Asia when it spied a plume of methane where none should be. Our team at <a href="https://www.ghgsat.com/">GHGSat</a>, in Montreal, instructed the spacecraft to pan over and zero in on the origin of the plume, which turned out to be a facility in an oil and gas field in Turkmenistan.</p> 
         1248 <p>The need to track down methane leaks has never been more important. In the slow-motion calamity that is climate change, methane emissions get less public attention than the carbon dioxide coming from smokestacks and tailpipes. But methane—which mostly comes from fossil-fuel production but also from livestock farming and other sources—has an outsize impact. Molecule for molecule, methane traps 84 times as much heat in the atmosphere as carbon dioxide does, and it accounts for about a quarter of the rise in atmospheric temperatures. Worse, research from earlier this year shows that we might be enormously underestimating the amount released—by as much as 25 to 40 percent.</p> 
         1249 <p>Satellites have been able to see greenhouse gases like methane and carbon dioxide from space for nearly 20 years, but it took a confluence of need and technological innovation to make such observations practical and accurate enough to do them for profit. Through some clever engineering and a more focused goal, our company has managed to build a 15-kilogram microsatellite and perform feats of detection that previously weren’t possible, even with a US $100 million, 1,000-kg spacecraft. Those scientific behemoths do their job admirably, but they view things on a kilometer scale. Claire can resolve methane emissions down to tens of meters. So a polluter (or anybody else) can determine not just what gas field is involved but which well in that field.</p> 
         1250 <figure class="xlrg" role="img"> 
         1251  <img alt="Photo of several images of methane plumes " src="/image/MzcwNjAyMw.jpeg"> 
         1252  <div class="ai"> 
         1253   <figcaption class="hi-cap">
         1254     Images: GHGSat 
         1255   </figcaption> 
         1256   <figcaption> 
         1257    <strong>Eye in the Sky:</strong> The microsatellite Claire has spotted a number of methane plumes over the last four years, including at the following locations: 1) the Balkan Region of western Turkmenistan; 2) a gas facility in Yamalo-Nenets Autonomous Okrug, in northwestern Siberia; 3) the Permian Basin, in western Texas;&nbsp; 4) the Lom Pangar Dam, in eastern Cameroon;&nbsp; and 5) a coal mine in Shanxi, China.&nbsp; 
         1258    <strong>&nbsp;</strong> 
         1259   </figcaption> 
         1260  </div> 
         1261 </figure> 
         1262 <p>Since launching Claire, our first microsatellite, we’ve improved on both the core technology—a miniaturized version of an instrument known as a wide-angle Fabry-Pérot imaging spectrometer—and the spacecraft itself. Our second methane-seeking satellite, dubbed Iris, launched this past September, and a third is scheduled to go up before the end of the year. When we’re done, there will be nowhere on Earth for methane leaks to hide.</p> 
         1263 <p></p> 
         1264 <p><strong>The creation of Claire</strong> and its siblings was driven by a business case and a technology challenge. The business part was born in mid-2011, when <a href="http://www.environnement.gouv.qc.ca/changementsclimatiques/marche-carbone_en.asp">Quebec (GHGSat’s home province) and California</a> each announced that they would implement a market-based “cap and trade” system. The systems would attribute a value to each ton of carbon emitted by industrial sites. Major emitters would be allotted a certain number of tons of carbon—or its equivalent in methane and other greenhouse gases—that they could release into the atmosphere each year. Those that needed to emit more could then purchase emissions credits from those that needed less. Over time, governments could shrink the total allotment to begin to reduce the drivers of climate change.</p> 
         1265 <p>Even in 2011, there was a wider, multibillion-dollar market for carbon emissions, which was growing steadily as more jurisdictions imposed taxes or implemented carbon-trading mechanisms. By 2019, these carbon markets covered 22 percent of global emissions and earned governments $45 billion, according to the <em><a href="https://openknowledge.worldbank.org/handle/10986/33809">World Bank’s State and Trends of Carbon Pricing 2020</a></em>.</p> 
         1266 <p>Despite those billions, it’s methane, not carbon dioxide, that has become the focus of our systems. One reason is technological—our original instrument was better tuned for methane. But the business reason is the simpler one: Methane has value whether there’s a greenhouse-gas trading system or not.</p> 
         1267 <p>Markets for greenhouse gases motivate the operators of industrial sites to better measure their emissions so they can control and ultimately reduce them. Existing, mostly ground-based methods using systems like <a href="https://www.sciencedirect.com/topics/earth-and-planetary-sciences/flux-chamber">flux chambers</a>, <a href="https://en.wikipedia.org/wiki/Eddy_covariance">eddy covariance towers</a>, and <a href="https://www.flir.com/instruments/optical-gas-imaging/">optical gas imaging</a> were fairly expensive, of limited accuracy, and varied as to their geographic availability. Our company’s bet was that industrial operators would flock to a single, less expensive, more precise solution that could spot greenhouse-gas emissions from individual industrial facilities anywhere in the world.</p> 
         1268 <figure class="xlrg" role="img"> 
         1269  <img alt="Photo of people in gowns in front of a satellite.  " src="/image/MzcwNjA4Mw.jpeg"> 
         1270  <div class="ai"> 
         1271   <figcaption class="hi-cap">
         1272     Photo: GHGSat 
         1273   </figcaption> 
         1274   <figcaption> 
         1275    <strong>Very Proud Parents:</strong> The team at the Space Flight Laboratory, in Toronto, with a brand new 15-kilogram methane-sensing microsatellite. 
         1276   </figcaption> 
         1277  </div> 
         1278 </figure> 
         1279 <p>Once we’d decided on our business plan, the only question was: Could we do it?</p> 
         1280 <p>One part of the question had already been answered, to a degree, by pioneering space missions such as Europe’s <a href="https://en.wikipedia.org/wiki/Envisat">Envisat</a> (which operated from 2002 to 2012) and Japan’s <a href="https://en.wikipedia.org/wiki/Greenhouse_Gases_Observing_Satellite">GOSat</a> (launched in 2009). These satellites measure surface-level trace gases using spectrometers that collect sunlight scattering off the earth. The spectrometers break down the incoming light by wavelength. Molecules in the light’s path will absorb a certain pattern of wavelengths, leaving dark bands in the spectrum. The greater the concentration of those molecules, the darker the bands. This method can measure methane concentrations from orbit with a precision that’s better than 1 percent of background levels.</p> 
         1281 <p>While those satellites proved the concept of methane tracking, their technology was far from what we needed. For one thing, the instruments are huge. The spectrometer portion of Envisat, called <a href="https://www.iup.uni-bremen.de/sciamachy/instrument/design/index.html">SCIAMACHY</a> (SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY), contained nearly <a href="https://www.iup.uni-bremen.de/sciamachy/instrument/design/index.html">200 </a>kg of <a href="https://earth.esa.int/web/guest/missions/esa-operational-eo-missions/envisat/instruments/sciamachy-handbook/wiki/-/wiki/SCIAMACHY%20Handbook/Optical+Assembly">complex optics</a>; the entire spacecraft carried eight other scientific instruments and weighed 8.2 metric tons. <a href="https://global.jaxa.jp/projects/sat/gosat/">GOSat</a>, which is dedicated to greenhouse-gas sensing, weighs 1.75 metric tons.</p> 
         1282 <p>Furthermore, these systems were designed to measure gas concentrations across the whole planet, quickly and repeatedly, in order to inform global climate modeling. Their instruments scan huge swaths of land and then average greenhouse-gas levels over tens or hundreds of square kilometers. And that is far too coarse to pinpoint an industrial site responsible for rogue emissions.</p> 
         1283 <p>To achieve our goals, we needed to design something that was the first of its kind—an orbiting hyperspectral imager with spatial resolution in the tens of meters. And to make it affordable enough to launch, we had to fit it in a 20-by-20-by-20-centimeter package.</p> 
         1284 <p><strong>The most critical</strong> enabling technology to meet those constraints was our spectrometer—the wide-angle Fabry-Pérot etalon (WAF-P). (An etalon is an interferometer made from two partially reflective plates.) To help you understand what that is, we’ve first got to explain a more common type of spectrometer and how it works in a hyperspectral imaging system.</p> 
         1285 <aside class="inlay rt med"> 
         1286  <h3 class="sb-hed">Sensing Methane From Space</h3> 
         1287  <p>The satellite measures the way a plume of gas [pink] absorbs portions of the spectrum of reflected sunlight. The key instrument involved is called a wide-angle Fabry-Pérot etalon.&nbsp; Right:&nbsp;Two infrared rays of different wavelengths streaking up to the satellite [top] from different points on the ground enter the satellite at different angles.</p> 
         1288  <ul class="sb-list"> 
         1289   <li> 
         1290    <figure class="med" role="img"> 
         1291     <a class="zoom" href="/image/MzcwNjM4OA.jpeg" rel="lightbox"><img alt="The satellite measures the way a plume of gas [pink] absorbs portions of the spectrum of reflected sunlight. The key instrument involved is called a wide-angle Fabry-Pérot etalon.  Right: Two infrared rays of different wavelengths streaking up to the satellite [top] from different points on the ground enter the satellite at different angles." src="/image/MzcwNjM4OA.jpeg"><span class="magnifier">&nbsp;</span></a> 
         1292    </figure> </li> 
         1293   <li> <p>The etalon is made up of two partially mirrored surfaces [bottom] held micrometers apart. A portion of the light passes through both surfaces; the rest reflects within the mirrored cavity before it passes through. If the light is of the right wavelength and enters at a particular angle, it will constructively interfere with itself [left]. The result is an angle-dependent wavelength filter [right].</p> 
         1294    <figure class="med" role="img"> 
         1295     <a class="zoom" href="/image/MzcwNjM5Nw.jpeg" rel="lightbox"><img alt="The etalon is made up of two partially mirrored surfaces [bottom] held micrometers apart. A portion of the light passes through both surfaces; the rest reflects within the mirrored cavity before it passes through. If the light is of the right wavelength and enters at a particular angle, it will constructively interfere with itself [left]. The result is an angle-dependent wavelength filter [right]." src="/image/MzcwNjM5Nw.jpeg"><span class="magnifier">&nbsp;</span></a> 
         1296     <figcaption class="hi-cap">
         1297       Illustration: James Provost 
         1298     </figcaption> 
         1299    </figure> </li> 
         1300  </ul> 
         1301 </aside> 
         1302 <p>Hyperspectral imaging detects a wide range of wavelengths, some of which, of course, are beyond the visible. To achieve such detection, you need both a spectrometer and an imager.</p> 
         1303 <p>The spectrometers in SCIAMACHY are based on diffraction gratings. A diffraction grating disperses the incoming light as a function of its wavelength—just as a prism spreads out the spectrum of white light into a rainbow. In space-based hyperspectral imaging systems, one dimension of the imager is used for spectral dispersion, and the other is used for spatial imaging. By imaging a narrow slit of a scene at the correct orientation, you get a spectrum at each point along that thin strip of land. As the spacecraft travels, sequential strips can be imaged to form a two-dimensional array of points, each of which has a full spectrum associated with it.</p> 
         1304 <p>If the incoming light has passed through a gas—say, Earth’s atmosphere—in a region tainted with methane, certain bands in the infrared part of that spectrum should be dimmer than otherwise in a pattern characteristic of that chemical.</p> 
         1305 <p>Such a spectral-imaging system works well, but making it compact is challenging for several reasons. One challenge is the need to minimize optical aberrations to achieve a sharp image of ground features and emission plumes. However, in remote sensing, the signal strength (and hence signal-to-noise ratio) is driven by the aperture size, and the larger this is, the more difficult it is to minimize aberrations. Adding a dispersive grating to the system leads to additional complexity in the optical system.</p> 
         1306 <p>A Fabry-Pérot etalon can be much more compact without the need for a complex imaging system, despite certain surmountable drawbacks. It is essentially two partially mirrored pieces of glass held very close together to form a reflective cavity. Imagine a beam of light of a certain wavelength entering the cavity at a slight angle through one of the mirrors. A fraction of that beam would zip across the cavity, squeak straight through the other mirror, and continue on to a lens that focuses it onto a pixel on an imager placed a short distance away. The rest of that beam of light would bounce back to the front mirror and then across to the back mirror. Again, a small fraction would pass through, the rest would continue to bounce between the mirrors, and the process would repeat. All that bouncing around adds distance to the light’s paths toward the pixel. If the light’s angle and its wavelength obey a particular relationship to the distance between the mirrors, all that light will constructively interfere with itself. Where that relation holds, a set of bright concentric rings forms. Different wavelengths and different angles would produce a different set of rings.</p> 
         1307 <p>In an imaging system with a Fabry-Pérot etalon like the ones in our satellites, the radius of the ring on the imager is roughly proportional to the ray angle. What this means for our system is that the etalon acts as an angle-dependent filter. So rather than dispersing the light by wavelength, we filter the light to specific wavelengths, depending on the light’s radial position within the scene. Since we’re looking at light transmitted through the atmosphere, we end up with dark rings at specific radii corresponding to molecular absorption lines.</p> 
         1308 <p>The etalon can be miniaturized more easily than a diffraction-grating spectrometer, because the spectral discrimination arises from interference that happens within a very small gap of tens to hundreds of micrometers; no large path lengths or beam separation is required. Furthermore, since the etalon consists of substrates that are parallel to one another, it doesn’t add significantly to aberrations, so you can use relatively straightforward optical-design techniques to obtain sufficient spatial resolution.</p> 
         1309 <p>However, there are complications associated with the WAF-P imaging spectrometer. For example, the imager behind the etalon picks up both the image of the scene (where the gas well is) and the interference pattern (the methane spectrum). That is, the spectral rings are embedded in—and corrupted by—the actual image of the patch of Earth the satellite is pointing at. So, from a single camera frame, you can’t distinguish variability in how much light reflects off the surface from changes in the amount of greenhouse gases in the atmosphere. Separating spatial and spectral information, so that we can pinpoint the origin of a methane plume, took some innovation.</p> 
         1310 <p></p> 
         1311 <p><strong>The computational process</strong> used to extract gas concentrations from spectral measurements is called a retrieval. The first step in getting this to work for the WAF-P was characterizing the instrument properly before launch. That produces a detailed model that can help predict precisely the spectral response of the system for each pixel.</p> 
         1312 <figure class="xlrg" role="img"> 
         1313  <img alt="Illustration of satellites.  " src="/image/MzcwNjQwOQ.jpeg"> 
         1314  <div class="ai"> 
         1315   <figcaption class="hi-cap">
         1316     Illustration: James Provost. Photos: GHGSat 
         1317   </figcaption> 
         1318   <figcaption> 
         1319    <strong>Putting It Together:</strong> To determine the complete spectrum of an entire scene, the satellite must take up to 200 images as it passes overhead [top]. That way each feature will be measured at all the relevant wavelengths [red rings, bottom]. The process, called a retrieval, reproduces an image of a methane plume. 
         1320   </figcaption> 
         1321  </div> 
         1322 </figure> 
         1323 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/RJgKHuM3HKw" width="620"></iframe></p> 
         1324 <p>But that’s just the beginning. Separating the etalon’s mixing of spectral and spatial information took some algorithmic magic. We overcame this issue by designing a protocol that captures a sequence of 200 overlapping images as the satellite flies over a site. At our satellite’s orbit, that means maximizing the time we have to acquire images by continuously adjusting the satellite’s orientation. In other words, we have the satellite stare at the site as it passes by, like a rubbernecking driver on a highway passing a car wreck.</p> 
         1325 <p>The next step in the retrieval procedure is to align the images, basically tracking all the ground locations within the scene through the sequence of images. This gives us a collection of up to 200 readings where a feature, say, a leaking gas well, passes across the complete interference pattern. This effectively is measuring the same spot on Earth at decreasing infrared wavelengths as that spot moves outward from the center of the image. If the methane concentration is anomalously high, this leads to small but predictable changes in signal level at specific positions on the image. Our retrievals software then compares these changes to its internal model of the system’s spectral response to extract methane levels in parts per million.</p> 
         1326 <p>At this point, the WAF-P’s drawbacks become an advantage. Some other satellites use separate instruments to visualize the ground and sense the methane or CO2 spectra. They then have to realign those two. Our system acquires both at once, so the gas plume automatically aligns with its point of origin down to the level of tens of meters. Then there’s the advantage of high spatial resolution. Other systems, such as Tropomi (TROPOspheric Monitoring Instrument, launched in 2017), must average methane density across a 7-kilometer-wide pixel. The peak concentration of a plume that Claire could spot would be so severely diluted by Tropomi’s resolution that it would seem only 1/200th as strong. So high-spatial-resolution systems like Claire can detect weaker emitters, not just pinpoint their location.</p> 
         1327 <p>Just handing a customer an image of their methane plume on a particular day is useful, but it’s not a complete picture. For weaker emitters, measurement noise can make it difficult to detect methane point sources from a single observation. But temporal averaging of multiple observations using our analytics tools reduces the noise: Even with a single satellite we can make 25 or more observations of a site per year, cloud cover permitting.</p> 
         1328 <p>Using that average, we then produce an estimation of the methane emission rate. The process takes snapshots of methane density measurements of the plume column and calculates how much methane must be leaking per hour to generate that kind of plume. Retrieving the emission rate requires knowledge of local wind conditions, because the excess methane density depends not only on the emission rate but also on how quickly the wind transports the emitted gas out of the area.</p> 
         1329 <p><strong>We’ve learned a lot</strong> in the four years since Claire started its observations. And we’ve managed to put some of those lessons into practice in our next generation of microsatellites, of which Iris is the first. The biggest lesson is to focus on methane and leave carbon dioxide for later.</p> 
         1330 <p>If methane is all we want to measure, we can adjust the design of the etalon so that it better measures methane’s corner of the infrared spectrum, instead of being broad enough to catch CO2’s as well. This, coupled with better optics that keep out extraneous light, should result in a 10-fold increase in methane sensitivity. So Iris and the satellites to follow will be able to spot smaller leaks than Claire can.</p> 
         1331 <figure class="xlrg" role="img"> 
         1332  <img alt="The heart of the wide-angle Fabry-Pérot imaging spectrometer." src="/image/MzcwNjQxMA.jpeg"> 
         1333  <div class="ai"> 
         1334   <figcaption class="hi-cap">
         1335     Photo: GHGSat 
         1336   </figcaption> 
         1337   <figcaption> 
         1338    <strong>Light Fantastic:</strong> The heart of the wide-angle Fabry-Pérot imaging spectrometer. 
         1339   </figcaption> 
         1340  </div> 
         1341 </figure> 
         1342 <p>We also discovered that our next satellites would need better radiation shielding. Radiation in orbit is a particular problem for the satellite’s imaging chip. Before launching Claire, we’d done careful calculations of how much shielding it needed, which were then balanced with the increased cost of the shielding’s weight. Nevertheless, Claire’s imager has been losing pixels more quickly than expected. (Our software partially compensates for the loss.) So Iris and the rest of the next generation sport heavier radiation shields.</p> 
         1343 <p>Another improvement involves data downloads. Claire has made about 6,000 observations in its first four years. The data is sent to Earth by radio as the satellite streaks past a single ground station in northern Canada. We don’t want future satellites to run into limits in the number of observations they make just because they don’t have enough time to download the data before their next appointment with a methane leak. So Iris is packed with more memory than Claire has, and the new microsatellite carries an experimental laser downlink in addition to its regular radio antenna. If all goes to plan, the laser should boost download speeds 1,000-fold, to 1 gigabit per second.</p> 
         1344 <p><strong>In its polar orbit,</strong> 500 kilometers above Earth, Claire passes over every part of the planet once every two weeks. With Iris, the frequency of coverage effectively doubles. And the addition in December of Hugo and three more microsatellites due to launch in 2021 will give us the ability to check in on any site on the planet almost daily—depending on cloud cover, of course.</p> 
         1345 <p>With our microsatellites’ resolution and frequency, we should be able to spot the bigger methane leaks, which make up about 70 percent of emissions. Closing off the other 30 percent will require a closer look. For example, with densely grouped facilities in a shale gas region, it may not be possible to attribute a leak to a specific facility from space. And a sizable leak detectable by satellite might be an indicator of several smaller leaks. So we have developed an aircraft-mounted version of the WAF-P instrument that can scan a site with 1-meter resolution. The first such instrument took its test flights in late 2019 and is now in commercial use monitoring a shale oil and gas site in British Columbia. Within the next year we expect to deploy a second airplane-mounted instrument and expand that service to the rest of North America.</p> 
         1346 <p>By providing our customers with fine-grained methane surveys, we’re allowing them to take the needed corrective action. Ultimately, these leaks are repaired by crews on the ground, but our approach aims to greatly reduce the need for in-person visits to facilities. And every source of fugitive emissions that is spotted and stopped represents a meaningful step toward mitigating climate change.</p> 
         1347 <p><em>This article appears in the November 2020 print issue as “Microsatellites Spot Mystery Methane Leaks.”</em></p> 
         1348 <h2>About the Author</h2> 
         1349 <p><a href="https://www.linkedin.com/in/jason-mckeever-7a4a555"><span>Jason McKeever</span></a><span>, </span><a href="https://www.linkedin.com/in/dylan-jervis-65807b24"><span>Dylan Jervis</span></a><span>, and </span><a href="https://www.linkedin.com/in/mathias-strupler-8b460121"><span>Mathias Strupler</span></a><span> are with </span><a href="https://www.ghgsat.com/"><span>GHGSat</span></a><span>, a remote-sensing company in Montreal. McKeever is the company’s science and systems lead, Jervis is a systems specialist, and Strupler is an optical systems specialist. </span></p>]]></content:encoded>
         1350       <dc:creator>By Jason McKeever</dc:creator>
         1351       <dc:creator>Dylan Jervis</dc:creator>
         1352       <dc:creator>Mathias Strupler</dc:creator>
         1353       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNTk4Nw.jpeg" />
         1354       <media:content url="https://spectrum.ieee.org/image/MzcwNTk4Nw.jpeg" />
         1355     </item>
         1356     <item>
         1357       <title>Going Carbon-Negative—Starting with Vodka</title>
         1358       <link>https://spectrum.ieee.org/podcast/energy/environment/going-carbonnegativestarting-with-vodka</link>
         1359       <description>A Brooklyn startup is an XPRIZE finalist for the way it turns CO2 into premium vodka</description>
         1360       <category>energy</category>
         1361       <category>energy/environment</category>
         1362       <pubDate>Tue, 27 Oct 2020 14:00:00 GMT</pubDate>
         1363       <guid>https://spectrum.ieee.org/podcast/energy/environment/going-carbonnegativestarting-with-vodka</guid>
         1364       <content:encoded><![CDATA[<style type="text/css">.detail-wrapper .article-detail .media-wrapper .buttons {
         1365    font-family: Theinhardt-Regular,sans-serif;
         1366    display: none;
         1367    clear: both;
         1368    padding: 10px 0;
         1369    overflow: hidden;
         1370 }
         1371 .article-detail article iframe, .article-detail iframe {
         1372     display: block;
         1373     margin: auto;
         1374     margin-bottom: 40px;
         1375 }
         1376 </style> 
         1377 <iframe frameborder="no" height="180" scrolling="no" seamless src="https://share.transistor.fm/e/08b8733c" width="100%"></iframe> 
         1378 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Hi this is Steven Cherry for Radio Spectrum.</span></p> 
         1379 <p class="MsoNormal"><span>In 2014, two Google engineers, </span><a href="/energy/renewables/what-it-would-really-take-to-reverse-climate-change"><span>writing in the pages of <em>IEEE Spectrum</em></span></a><span>, noted that “if all power plants and industrial facilities switch over to zero-carbon energy sources right now, we’ll still be left with a ruinous amount of CO<sub>2</sub> in the atmosphere. It would take centuries for atmospheric levels to return to normal, which means centuries of warming and instability.” Citing the work of climatologist James Hansen, they continued: “To bring levels down below the safety threshold, Hansen’s models show that we must not only cease emitting CO<sub>2</sub> as soon as possible but also actively remove the gas from the air and store the carbon in a stable form.”</span></p> 
         1380 <p class="MsoNormal"><span>One alternative is to grab carbon dioxide as it’s produced, and stuff it underground or elsewhere. People have been talking about CSS, which alternatively stands for carbon capture and storage, or carbon capture and sequestration, for well over a decade. But you can look around, for example at Exxon-Mobil’s </span><a href="https://corporate.exxonmobil.com/Research-and-innovation/Carbon-capture-and-storage"><span>website</span></a><span>, and see how much progress <em>hasn’t </em>been made.</span></p> 
         1381 <p class="MsoNormal"><span>In fact, in 2015, a bunch of mostly Canadian energy producers decided on a different route. They went to the XPRIZE people and funded what came to be called the </span><a href="https://carbon.xprize.org/prizes/carbon"><span>Carbon XPRIZE</span></a><span> to, as a <em>Spectrum</em> article at the time said, turn “</span><a href="/energywise/energy/fossil-fuels/carbon-polluters-fund-xprize-to-repurpose-their-emissions"><span>CO<sub>2</sub> molecules into products with higher added value</span></a><span>.”</span></p> 
         1382 <p class="MsoNormal"><span>In 2018, the XPRIZE announced 10 finalists, who divvied up a $5 million incremental prize. The prize timeline called for five teams each to begin an operational phase in two locations, one in Wyoming and the other in Alberta, culminating in a $20 million grand prize. And then the coronavirus hit, rebooting the prize timeline.</span></p> 
         1383 <p class="MsoNormal"><span>One of the more unlikely finalists emerged from the hipsterish Bushwick neighborhood of Brooklyn, N.Y. Their solution to climate change: vodka. Yes, vodka. The finalist, which calls itself the </span><a href="https://aircompany.com/"><span>Air Company</span></a><span>, takes carbon dioxide that has been liquified and distills it into ethanol, and then fine-tunes it into vodka. The resulting product is, the company claims, not only carbon-neutral but carbon negative.</span></p> 
         1384 <p class="MsoNormal"><span>The scientific half of founding duo of the Air Company is </span><a href="https://talented12.cenmag.org/staff-sheehan/"><span>Stafford Sheehan</span></a><span>—Staff, as he’s known. He had two startups under his belt by the time he graduated from Boston College. He started his next venture while in graduate school at Yale. He’s a prolific researcher but he’s determined to find commercially viable ways to reduce the carbon in the air, and he’s my guest today, via Skype.</span></p> 
         1385 <p class="MsoNormal"><span>Staff, welcome to the podcast. </span></p> 
         1386 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Thanks very much for having me. Steven. </span></p> 
         1387 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Staff, I’m sure people have been teasing you that maybe vodka doesn’t solve the problem of climate change entirely, but it can make us forget it for a while. But in serious engineering terms, the Air Company process seems a remarkable advance. Talk us through it. It starts with liquefied carbon dioxide. </span></p> 
         1388 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yeah, happy to. So, we use liquefied carbon dioxide because we source it offsite in in Bushwick. But really, we can just feed any sort of carbon dioxide into our system. We combine the carbon dioxide with water by first splitting the water into hydrogen and oxygen. Water is H<sub>2</sub>O, so we use what’s called an electrolyzer to split water into hydrogen gas and oxygen gas and then combine the hydrogen together with carbon dioxide in a reactor over proprietary catalysts that I and my coworkers developed over the course of the last several years. And that produces a mixture of ethanol and water that we then distill to make a very, very clean and very, very pure vodka. </span></p> 
         1389 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Your claim that the product is carbon-negative is based on a life-cycle analysis. The calculation starts with an initial minus of the amount of carbon you take out of the atmosphere. And then we start adding back the carbon and carbon equivalents needed to get it into a bottle and onto the shelf of a hipster bar. That first step where your supplier takes carbon out of the atmosphere, puts it into liquefied form and then delivers it to your distillery. That puts about 10 percent of that that carbon back into the atmosphere. </span></p> 
         1390 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yeah, 10 to 20 percent. When a tonne of carbon dioxide arrives in liquid form at our Bushwick facility, we assume that it took 200 kilograms of CO<sub>2</sub> emitted—not only for the capture of the carbon dioxide; most of the carbon dioxide that we get actually comes from fuel ethanol fermentation. So we take the carbon dioxide emissions of the existing ethanol industry and we’re turning that into a higher purity ethanol. But it’s captured from those facilities and then it’s liquefied and transported to our Bushwick facility. And if you integrate the lifecycle carbon emissions of all of the equipment, all the steel, all of the transportation, every part of that process, then you you get about a maximum life-cycle CO<sub>2</sub> emissions for the carbon dioxide of 200 kilograms per ton. So we still have eight hundred kilograms to play with at our facility. </span></p> 
         1391 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>So another 10 percent gets eaten up by that electrolysis process.</span></p> 
         1392 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yeah. The electrolysis process is highly dependent on what sort of electricity you use to power it with. We use a company called Clean Choice. And we’re we work very closely with a number of solar and wind deployers in New York State to make sure that all the electricity that’s used at our facility is solar or wind. And if you use wind energy, that’s the most carbon-friendly energy source that we have available there. Right now, the mix that we have, which is certified through Con Edison, is actually very heavily wind and a little bit of solar. But that was the lowest lifecycle-intensity electricity that we could get. So we get ... it’s actually a little bit less than 10 percent of that is consumed by electrolysis. So the electrolysis is actually quite green as long as you power it with a very low-carbon source of electricity. </span></p> 
         1393 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>And the distilling process, even though it’s solar-based, takes maybe another 13 percent or so? </span></p> 
         1394 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>It’s in that ballpark. The distilling process is powered by an electric steam boiler. So we use the same electricity that we use to split water, to heat our water for the distillation system. So we have a fully electric distillery process. You could say that we’ve electrified vodka distilling. </span></p> 
         1395 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>There’s presumably a bit more by way of carbon equivalents when it comes to the bottles the vodka comes in, shipping it to customers, and so on, but that’s true of any vodka that ends up on that shelf of any bar, and those also have a carbon-emitting farming process—whether it’s potatoes or sugar beets or wheat or whatever—that </span><span>your process sidesteps. </span></p> 
         1396 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yes. And I think one thing that’s really important is, this electrification act aspect by electrifying or all of our distillery processes, for example, if you’re boiling water using a natural gas boiler, your carbon emissions are going to be much, much higher as compared to boiling water using an electric steam boiler that’s powered with wind energy. </span></p> 
         1397 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>It seems like if you just poured the vodka down the drain or into the East River, you would be benefiting the environment. I mean, would it be possible to do that on an industrial scale as a form of carbon capture and storage that really works? </span></p> 
         1398 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yeah. I don’t think you’d want to pour good alcohol down the drain in any capacity just because the alcohol that we make can offset the use of fossil fuel alcohol. </span></p> 
         1399 <p class="MsoNormal"><span>So by putting the alcohol that we make—this carbon negative alcohol that we make—into the market, that means you have to make less fossil alcohol. And I’m including corn ethanol in that because so many fossil fuels go into its production. But that makes it so that our indirect CO<sub>2</sub>, our indirect CO<sub>2</sub> utilization is very, very high because we’re offsetting a very carbon-intensive product. </span></p> 
         1400 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>That’s interesting. I was thinking that maybe you could earn carbon credits and sell them for more than you might make with having a, you know, another pricey competitor to Grey Goose and Ketel One. </span></p> 
         1401 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>The carbon credit, the carbon credit system is still very young, especially in the US. </span></p> 
         1402 <p class="MsoNormal"><span>We also … our technology still has a ways to scale between our Bushwick facility—which is, I would say, a micro distillery—and a real bona industrial process, which … we’re working on that right now. </span></p> 
         1403 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Speaking of which, though, it is rather pricey stuff at this point, isn’t it? Did I read $65 or $70 a bottle? </span></p> 
         1404 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yeah, it’s pricey not only because you pay a premium for our electricity, for renewable electricity, but we also pay a premium for carbon dioxide that, you know, has that that only emits 10 to 20 percent of the carbon intensity of its actual weight, so we pay a lot more for the inputs than is typical—sustainability costs money—and also we’re building these systems, they’re R&amp;D systems, and so they’re <span>&nbsp;</span>more costly to operate on a R&amp;D scale, on kind of our pilot plant scale. As we scale up, the cost will go down. But at the scales we’re at right now, we need to be able to sell a premium product to be able to have a viable business. Now, on top of that, the product is also won a lot of awards that put it in that price category. It’s won three gold medals in the three most prestigious blind taste test competitions. And it’s won a lot of other spirits and design industry awards that enable us to get that sort of cost for it. </span></p> 
         1405 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>I’m eager to do my own blind taste testing. Vodka is typically 80 proof, meaning it’s 60 percent water. You and your co-founder went on an epic search for just the right water. </span></p> 
         1406 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>That we did. We tested over ... probably over one hundred and thirty different types of water. We tried to find which one was best to make vodka with using the very, very highly pure ethanol that comes out of our process. And it’s a very nuanced thing. Water, by changing things like the mineral content, the pH, by changing the very, very small trace impurities in the water—that in many cases are good for you—can really change the way the water feels in your mouth and the way that it tastes. And adding alcohol to water just really amplifies that. It lowers the boiling point and it makes it more volatile so that it feels different in your mouth. And so different types of water have a different mouth feel; they have a different taste. We did a lot of research on water to be able to find the right one to mix with our vodka. </span></p> 
         1407 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Did you end up where you started with New York water? </span></p> 
         1408 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yes. In in a in a sense, we are we’re very, very close to where we started. </span></p> 
         1409 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>I guess we have to add your vodka to the list that New Yorkers would claim includes New York’s bagels and New York’s pizza as uniquely good, because if their water. </span></p> 
         1410 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Bagels, pizza, vodka ... hand sanitizer ... </span></p> 
         1411 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>It’s a well-balanced diet. So where do things stand with the XPRIZE? I gather you finally made it to Canada for this operational round, but take us through the journey getting there. </span></p> 
         1412 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>So I initially entered the XPRIZE when it was soliciting for very first submissions—I believe it was 2016—and going through the different stages, we had at the end of 2017, we had very rigorous due diligence on our prototype scale. And we passed through that and got good marks and continuously progressed through to the finals where we are now. Now, of course, coronavirus kind of threw both our team and many other teams for a loop, delaying deployment, especially for us: We’re the only American team deploying in Canada. The other four teams that are deploying at the ACCTC [</span><span>Alberta&nbsp;Carbon&nbsp;Conversion Technology Centre</span><span>] are all Canadian teams. So being the only international team in a time of a global pandemic that, you know, essentially halted all international travel—and a lot of international commerce—put some substantial barriers in our way. But over the course of the last seven months or so, we’ve been able to get back on our feet. And I’m currently sitting in quarantine in Mississauga, Ontario, getting ready for a factory-acceptance test. That’s scheduled to happen right at the same time as quarantine ends. So we’re gonna be at the end of this month landing our skid in Alberta for the finals and then in November, going through diligence and everything else to prove out its operation and then operating it through the rest of the year. </span></p> 
         1413 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>I understand that you weren’t one of the original 10 finalists named in 2018. </span></p> 
         1414 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>No, we were not. We were the runner-up. There was a runner-up for each track—the Wyoming track and the Alberta track. And ultimately, there were teams that dropped out or merged for reasons within their own businesses. We were given the opportunity to rejoin the competition. We decided to take it because it was a good proving ground for our next step of scale, and it provided a lot of infrastructure that allowed us to do that at a reasonable cost—at a reasonable cost for us and at a reasonable cost in terms of our time. </span></p> 
         1415 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Staff, you were previously a co-founder of a startup called Catalytic Innovations. In fact, you were a 2016 Forbes magazine, 30-under-30 because of it. What was it? And is it? And how did it lead to Air Company and vodka? </span></p> 
         1416 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>For sure. That was a company that I spun out of Yale University, along with a professor at Yale, </span><a href="https://environment.yale.edu/profile/anastas"><span>Paul Anastas</span></a><span>. We initially targeted making new catalysts for fuel cell and electrolysis industries, focusing around the water oxidation reaction. So to turn carbon dioxide—or to produce fuel in general using renewable electricity—there are three major things that need to happen. You need to have a very efficient renewable energy source. Trees, for example, use the sun. That’s photosynthesis. You have to be able to oxidize water into oxygen gas. And that’s why trees breathe out oxygen. And you have to be able to use the protons and electrons that come out of water oxidation to either reduce carbon dioxide or through some other method, produce a fuel. So I studied all three of those when I was in graduate school, and upon graduating, I spun out Catalytic Innovations that focused on the water oxidation reaction and commercializing materials that more efficiently produced oxygen for all of The man-made processes such as metal refining that do that chemistry. And that company found its niche in corrosion—anti-corrosion and corrosion protection—because one of the big challenges, whenever you’re producing oxygen, be it for renewable fuels or be it to produce zinc or to do a handful of different electrorefining and electrowinning processes in the metal industry. You always have a very serious corrosion problem. Did a lot of work in that industry in Catalytic Innovations, and they still continue to do work there, to this day. </span></p> 
         1417 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>You and your current co-founder, Greg Constantine, are a classic match—a technologist, in this case an electrochemist and a marketer. If this were a movie, you would have met in a bar drinking vodka. And I understand you actually did meet at a bar. Were you drinking vodka? </span></p> 
         1418 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>No, we were actually drinking whiskey. So I didn’t ... I actually I’m not a big fan of vodka pre-Air Company, but it was the product that really gave us the best value proposition where really, really clean, highly pure ethanol is most important. So I’ve always been more of a whiskey man myself, and Greg and I met over whiskey in Israel when we were on a trip that was for Forbes. You know, they sent us out there because we were both part of their 30-Under-30 list and we became really good friends out there. And then several months later, fast forward, we started Air Company. </span></p> 
         1419 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Air Company’s charter makes it look like you would like to go far beyond vodka when it comes to finding useful things to do with CO<sub>2</sub>. In the very near term, you turned to using your alcohol in a way that contributes to our safety. </span></p> 
         1420 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yeah. So we we had always planned the air company, not the air vodka company. We had always planned to go into several different verticals with ultra-high-purity ethanol that we create. And spirits is one of the places where you can realize the value proposition of a very clean and highly pure alcohol, very readily—spirits, fragrance is another one. But down the list a little bit is sanitizer, specifically hand sanitizer. And when coronavirus hit, we actually pivoted all of our technology because there was a really, really major shortage of sanitizer in New York City. A lot of my friends from graduate school that had kind of gone more on the medical track were telling me that the hospitals that they worked in, in New York didn’t have any hand sanitizer. And when the hospitals—for the nurses and doctors—ran out of hand sanitizer, that means you really have a shortage. And so we pivoted all of our technology to produce sanitizer in March. And for three months after that, we gave it away. We donated it to these hospitals, to the fire department, to NYPD and to other organizations in the city that needed it most. </span></p> 
         1421 <p class="MsoNormal"><span>Yeah, the hand sanitizer, I like to think, is also a very premium product. You can’t realize the benefits of the very, very clean and pure ethanol that we use for it as readily as you can with the bad guys since you’re not tasting it. But we did have to go through all of the facility registrations and that sort of thing to make the sanitizer because it is classified as a drug. So our pilot plant in and in Bushwick, which was a converted warehouse, I used to tell people in March that I always knew my future was going to be sitting in a dark warehouse in Bushwick making drugs. But, you know, never thought that it was actually going to become a reality. </span></p> 
         1422 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>That was in the short term. By now, you can get sanitizer in every supermarket and Home Depot. What are the longer-term prospects for going beyond vodka? </span></p> 
         1423 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Longer term, we’re looking at commodity chemicals, even going on to fuel. So longer term, we’re looking at the other verticals where we can take advantage of the high-purity value proposition of our ethanol—like pharmaceuticals, as a chemical feedstock, things like that. But then as we scale, we want to be able to make renewable fuel as well from this and renewable chemicals. Ultimately, we want to we want to get to world scale with this technology, but we need to take the appropriate steps to get there. And what we’re doing now are the stepping-stones to scaling it. </span></p> 
         1424 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>It seems like if you could locate the distilling operation right at the ethanol plant, you would just be making more ethanol for them with their waste product, avoid a lot of shipping and so forth. It, you would just become of value add to their industry. </span></p> 
         1425 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>That is something that we hope to do in the long term. You know what, our current skids are fairly small scale where we couldn’t take a massive amount of CO<sub>2</sub> with them. But as we scale, we do hope to get there gradually when we get to larger scales, like talking about several barrels per day rather than liters per hour, which is the scale we’re at now. </span></p> 
         1426 <p class="MsoNormal"><span>A lot of stuff you can turn CO<sub>2</sub> into. One of the prime examples is calcium carbonate. C03-[[minus]] CO<sub>2</sub> is CO<sub>2</sub>. You can very easily convert carbon dioxide into things like that for building materials. So pour concrete for different parts of bricks and things like that. There are a lot of different ways to mineralized CO<sub>2</sub> as well. Like you can inject it into the ground. That will also turn it into carbon-based minerals. Beyond that, as far as more complex chemical conversion goes, the list is almost endless. You can make plastics. You can make pharmaceutical materials. You can make all sorts of crazy stuff from CO<sub>2</sub>. Almost any of the base chemicals that have carbon in them can come from CO<sub>2</sub>. And in a way, they do come from CO<sub>2</sub> because all the petrochemicals that we mine from the ground, that they’re from photosynthesis that happened over the course of the last two billion years. </span></p> 
         1427 <p class="MsoNormal"><span>Have you ever seen the movie Forest Gump? There’s a part in that where Bubba, Gump’s buddy in the Vietnam War, talks about all the things you can do with shrimp. And it kind of goes on and on and on. But I could say the same about CO<sub>2</sub>. You can make plastic. You can make clothes. You can make sneakers. You can make alcohol. You can make any sort of chemical carbon-based ethylene, carbon monoxide, formic acid, methanol, ethanol. And there ... The list goes on. Just about any carbon-based chemical you can think of. You can make from CO<sub>2</sub>. </span></p> 
         1428 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Would it be possible to pull carbon dioxide out of a plastic itself and thereby solve two problems at once? </span></p> 
         1429 <p class="MsoNormal"><span>Yeah, you could you could take plastic and capture the CO<sub>2</sub> that’s emitted when you either incinerate it or where you gasify it. That is a strategy that’s used in certain places, gasification of waste, municipal waste. It doesn’t give you CO<sub>2</sub>, but it actually gives you something that you can do chemistry with a little more easily. It gives you a </span><a href="http://biofuel.org.uk/what-is-syngas.html"><span>syngas</span></a><span>—a mixture of carbon monoxide and hydrogen. So, there are a lot of different strategies that you can use to convert CO<sub>2</sub> into things better for the planet than global warming. </span></p> 
         1430 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>If hydrogen is a byproduct of that, you have a ready use for it. </span></p> 
         1431 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Yeah, exactly, that is one of the many places where we could source feedstock materials for our process. Our process is versatile and that’s one of the big advantages to it. </span></p> 
         1432 <p class="MsoNormal"><span>If we get hydrogen, as a byproduct of chloralkali production, for example, we can use that instead of having to source the electrolyzer. If our CO<sub>2</sub> comes from direct air capture, we can use that. And that means we can place our plants pretty much wherever there’s literally air, water and sunlight. As far as the products that come out, liquid products that are made from CO<sub>2</sub> have a big advantage in that they can be transported and they’re not as volatile, obviously, as the gases. </span></p> 
         1433 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>Well, Staff, it’s a remarkable story, one that certainly earns you that XPRIZE finalist berth. We wish you great luck with it. But it seems like your good fortune is self-made and assured, in any event to the benefit of the planet. Thank you for joining us today. </span></p> 
         1434 <p class="MsoNormal"><strong><span>Stafford Sheehan </span></strong><span>Thanks very much for having me, Steven. </span></p> 
         1435 <p class="MsoNormal"><strong><span>Steven Cherry </span></strong><span>We’ve been speaking with Staff Sheehan, co-founder of the Air Company, a Brooklyn startup working to actively undo the toxic effects of global warming.</span></p> 
         1436 <p class="MsoNormal"><span>This interview was recorded October 2, 2020. Our thanks to Miles of Gotham Podcast Studio for our audio engineering; our </span><a href="https://www.youtube.com/watch?v=x6i8iQ1c0MM"><span>music</span></a><span> is by </span><a href="https://freemusicarchive.org/music/Chad_Crouch"><span>Chad Crouch</span></a><span>.</span></p> 
         1437 <p class="MsoNormal"><span>Radio Spectrum is brought to you by <em>IEEE Spectrum</em>, the member magazine of the Institute of Electrical and Electronic Engineers. </span></p> 
         1438 <p class="MsoNormal"><span>For Radio Spectrum, I’m </span><a href="mailto:metaphor@ieee.org"><span>Steven Cherry</span></a><span>.</span></p> 
         1439 <p class="MsoNormal"><span>&nbsp;</span></p> 
         1440 <p class="MsoNormal"><em><span>Note: Transcripts are created for the convenience of our readers and listeners. The authoritative record of IEEE Spectrum’s audio programming is the audio version.</span></em></p> 
         1441 <p class="MsoNormal"><em><span>We welcome your comments on Twitter (</span></em><a href="https://twitter.com/radiospectrum1"><span>@RadioSpectrum1</span></a><em><span> and </span></em><a href="https://twitter.com/IEEESpectrum"><span>@IEEESpectrum</span></a><em><span>) and </span></em><a href="https://www.facebook.com/IEEE.Spectrum"><span>Facebook</span></a><em><span>.</span></em></p> 
         1442 <p class="MsoNormal"><span>&nbsp;</span></p>]]></content:encoded>
         1443       <dc:creator>Steven Cherry</dc:creator>
         1444       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwOTYxNQ.jpeg" />
         1445       <media:content url="https://spectrum.ieee.org/image/MzcwOTYxNQ.jpeg" />
         1446     </item>
         1447     <item>
         1448       <title>IROS Robotics Conference Is Online Now and Completely Free</title>
         1449       <link>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/iros-2020-online</link>
         1450       <description>Join 13,000 online attendees for the world’s largest virtual robotics conference</description>
         1451       <category>robotics</category>
         1452       <category>robotics/robotics-hardware</category>
         1453       <pubDate>Mon, 26 Oct 2020 21:00:00 GMT</pubDate>
         1454       <guid>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/iros-2020-online</guid>
         1455       <content:encoded><![CDATA[<p>The 2020 International Conference on Intelligent Robots and Systems (IROS) was originally going to be held in Las Vegas this week. Like ICRA last spring, IROS has transitioned to a completely online conference, which is wonderful news: Now everyone everywhere can participate in IROS without having to spend a dime on travel.</p> 
         1456 <p>IROS officially opened yesterday, and the best news is that registration is entirely free! We’ll take a quick look at what IROS has on offer this year, which includes some stuff that’s brand news to IROS.</p> 
         1457 <!--nextpage--> 
         1458 <p><a href="https://www.iros2020.org/ondemand/signup">Registration for IROS is super easy</a>, and did we mention that it’s free? To register, just go <a href="https://www.iros2020.org/ondemand/signup">here</a> and fill out a quick and easy form. You don’t even have to be an IEEE Member or anything like that, although in our unbiased opinion, an <a href="https://www.ieee.org/membership-application/public/join.html?grade=Member&amp;promo=SPECJOIN">IEEE membership is well worth it</a>. Once you get the confirmation email, go to <a href="https://www.iros2020.org/ondemand/">https://www.iros2020.org/ondemand/</a>, put in the email address you used to register, and that’s it, you’ve got IROS!</p> 
         1459 <p>Here are some highlights:</p> 
         1460 <h3>Plenaries and Keynotes</h3> 
         1461 <p>Without the normal space and time constraints, you won’t have to pick and choose between any of the three plenaries or 10 keynotes. Some of them are fancier than others, but we’re used to that sort of thing by now. It’s worth noting that all three plenaries (and three of the 10 keynotes) are given by extraordinarily talented women, which is excellent to see.</p> 
         1462 <h3>Technical Tracks</h3> 
         1463 <p>There are over 1,400 technical talks, divided up into 12 categories of 20 sessions each. Note that each of the 12 categories that you see on the main page can be scrolled through to show all 20 of the sessions; if there’s a bright red arrow pointing left or right you can scroll, and if the arrow is transparent, you’ve reached the end.</p> 
         1464 <p>On the session page, you’ll see an autoplaying advertisement (that you can mute but not stop), below which each talk has a preview slide, a link to a ~15 minute presentation video, and another link to a PDF of the paper. No supplementary videos are available, which is a bit disappointing. While you can leave a comment on the video, there’s no way of interacting with the author(s) directly through the IROS site, so you’ll have to check the paper for an email address if you want to ask a question.</p> 
         1465 <h3>Award Finalists</h3> 
         1466 <p>IROS has thoughtfully grouped all of the paper award finalists together into nine sessions. These are some truly outstanding papers, and it’s worth watching these sessions even if you’re not interested in specific subject matter.</p> 
         1467 <h3>Workshops and Tutorials</h3> 
         1468 <p>This stuff is a little more impacted by asynchronicity and on-demandedness, and some of the workshops and tutorials have already taken place. But IROS has done a good job at collecting videos of everything and making them easy to access, and the dedicated websites for the workshops and tutorials themselves sometimes have more detailed info. If you’re having trouble finding where the workshops and tutorial section is, try the “Entrance” drop-down menu up at the top.</p> 
         1469 <h3>IROS Original Series</h3> 
         1470 <p>In place of social events and lab tours, IROS this year has come up with the “IROS Original Series,” which “hosts unique content that would be difficult to see at in-person events.” Right now, there are some interviews with a diverse group of interesting roboticists, and hopefully more will show up later on.</p> 
         1471 <h3>Enjoy!</h3> 
         1472 <p>Everything on the IROS On-Demand site should be available for at least the next month, so there’s no need to try and watch a thousand presentations over three days (which is what we normally have to do). So, relax, and enjoy yourself a bit by browsing all the options. And additional content will be made available over the next several weeks, so make sure to check back often to see what’s new.</p> 
         1473 <p>[ <a href="https://www.iros2020.org/">IROS 2020</a> ]</p>]]></content:encoded>
         1474       <dc:creator>Evan Ackerman</dc:creator>
         1475       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwOTMxNA.jpeg" />
         1476       <media:content url="https://spectrum.ieee.org/image/MzcwOTMxNA.jpeg" />
         1477     </item>
         1478     <item>
         1479       <title>MIT Unveils a Roboat Big Enough to Stand on</title>
         1480       <link>https://spectrum.ieee.org/cars-that-think/transportation/marine/mit-unveils-a-roboat-big-enough-to-stand-on</link>
         1481       <description>A bunch of these boats could meet up to form ad hoc structures, like a bridge or a platform.</description>
         1482       <category>transportation</category>
         1483       <category>transportation/marine</category>
         1484       <pubDate>Mon, 26 Oct 2020 19:00:00 GMT</pubDate>
         1485       <guid>https://spectrum.ieee.org/cars-that-think/transportation/marine/mit-unveils-a-roboat-big-enough-to-stand-on</guid>
         1486       <content:encoded><![CDATA[<p>An early idea&nbsp;for making the most of autonomous technology was to put a human-driven lead car in front of a string of robocars. It’s called platooning, and it looks for all the world like a mama goose leading a gaggle of goslings.</p> 
         1487 <p>You can play “follow the leader” on the water, too, but because boats can easily touch and move in tandem, you can have much more complex arrangements than simple caravans. The coordination between the lead boats and the followers allows&nbsp;you to go lighter on sensors and other hardware when designing those follower boats, which can rely on the lead boat to sense the wider environment. This all means that&nbsp;small boats can form and reform in a variety of ways—“shapeshifting” into useful&nbsp;structures like&nbsp;a bridge or a platform. Presto! You’ve got yourself a lilypad fit&nbsp;to host&nbsp;a popup event on a canal or lake—say, a flower show&nbsp;or a concert.</p> 
         1488 <p>“You could create on demand whatever is needed to shift human activities to the water,” <a href="https://www.csail.mit.edu/person/daniela-rus">Daniela Rus</a>, a professor of electrical engineer and computer science at MIT tells <em>IEEE Spectrum</em>. She is one of the leaders of a project jointly run by the university’s <a href="https://www.csail.mit.edu/">Computer Science and Artificial Intelligence Laboratory</a> (CSAIL) and its <a href="http://senseable.mit.edu/">Senseable City Lab</a>, to explore autonomous boats.</p> 
         1489 <p>Five years ago the project began by moving <a href="/automaton/robotics/humanoids/video-friday-emys-expressive-robot-head-darpa-luke-arm-cyborg-moth">meter-long boats</a> around in pools and in a canal; now it is graduating to bigger boats.&nbsp;“Roboat II” measures 2 meters, which MIT calls “Covid-friendly” because&nbsp;it is sufficient for social isolation between passengers. It is being tested on Boston’s Charles River, and it has also braved the canals of Amsterdam, where it steered itself around for three hours and returned to within 17 centimeters (6.7 inches) of its origin. A full-size model, at 4 meters, is being built by the <a href="https://www.ams-institute.org/">Amsterdam Institute for Advanced Metropolitan Solutions</a>, for testing in Amsterdam’s canals.</p> 
         1490 <p>The latest developments are described in a paper that is being presented today virtually, at the <a href="https://www.iros2020.org/">International Conference on Intelligent Robots and Systems</a>. The lead author is <a href="https://scholar.google.com/citations?user=vdBpX4cAAAAJ&amp;hl=en">Wei Wang</a>, a postdoctoral fellow at MIT; Rus is among the co-authors.</p> 
         1491 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/OYmVwvP_pD0?rel=0" width="620"></iframe></p> 
         1492 <p>“In order to be really accurate and to get situational awareness you have to put a rich set of sensors on every single node in the system—every single boat,” Rus says. “With the leader-follower arrangement you don't have to have that. If you have a large swarm of vehicles, do they all need to be endowed with all the sensors?”</p> 
         1493 <p>MIT’s smallest robo-boats haven’t made the jump from the lab to the marketplace, although they can in principle be used for mapping, water-pollution testing, and other highly localized work. The real promise lies in bigger boats, with their greater carrying capacity and longer run times between charges.</p> 
         1494 <p>Autonomous charging is possible, though, for boats large or small. They can always plug themselves in, much as a Roomba vacuum cleaner does.&nbsp;<span>Full autonomy is the ultimate goal, but there’s a lot that can be done even now. Just follow the leader.</span></p>]]></content:encoded>
         1495       <dc:creator>Philip E. Ross</dc:creator>
         1496       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwODE5NA.jpeg" />
         1497       <media:content url="https://spectrum.ieee.org/image/MzcwODE5NA.jpeg" />
         1498     </item>
         1499     <item>
         1500       <title>Conductorless Orchestra Helps EE Students Fine Tune Their Professional Skills</title>
         1501       <link>https://spectrum.ieee.org/the-institute/ieee-member-news/conductorless-orchestra-helps-ee-students-fine-tune-their-professional-skills</link>
         1502       <description>Developed by an Olin College engineering professor, the program teaches students about leadership, teamwork, and communication</description>
         1503       <category>the-institute</category>
         1504       <category>the-institute/ieee-member-news</category>
         1505       <pubDate>Mon, 26 Oct 2020 18:00:00 GMT</pubDate>
         1506       <guid>https://spectrum.ieee.org/the-institute/ieee-member-news/conductorless-orchestra-helps-ee-students-fine-tune-their-professional-skills</guid>
         1507       <content:encoded><![CDATA[<figure class="xlrg" role="img"> 
         1508  <img alt="Dabby Interacting with the orchestra during a OCO dress rehearsal, 2015." src="/image/MzcwODkyNw.jpeg"> 
         1509  <figcaption class="hi-cap">
         1510    Photo: Alexander Budnitz 
         1511  </figcaption> 
         1512  <figcaption>
         1513    Diana Dabby, creator of the Olin Conductorless Orchestra, interacting with the players during a dress rehearsal in 2015. 
         1514  </figcaption> 
         1515 </figure> 
         1516 <style type="text/css">.entry-content .tisubhead {
         1517     color: #999999;
         1518     font-family: verdana;
         1519     font-size: 14px;
         1520     font-weight: bold;
         1521     letter-spacing: 1px;
         1522     margin-bottom: -5px !important;
         1523     text-transform: uppercase;
         1524 }
         1525 .tiopener {
         1526     color: #0f4994;
         1527     font-family: Theinhardt-Medium, sans-serif;
         1528   letter-spacing: 1px;
         1529   margin-right: 10px;
         1530     font-weight: bold;
         1531     text-transform: uppercase;
         1532 }
         1533 </style> 
         1534 <p><span class="tiopener">THE INSTITUTE </span><a href="https://www.olin.edu/faculty/profile/diana-dabby/">Diana Dabby</a> grew up surrounded by music—both her parents were pianists. The IEEE member followed in their footsteps and earned a bachelor’s degree in music from <a href="https://www.vassar.edu/">Vassar College</a>, in Poughkeepsie, N.Y. After graduating, she moved to New York City and worked as a pianist, performing at venues including <a href="https://www.kaufmanmusiccenter.org/mch/">Merkin Hall</a> and <a href="https://www.carnegiehall.org/events/weill-recital-hall">Weill Recital Hall</a>.</p> 
         1535 <p>Although Dabby was passionate about music, she had an unsettling feeling that something was missing. That something turned out to be engineering—which she discovered after she read journal articles about engineering’s relationship to music. She decided to pursue a graduate degree in the field.</p> 
         1536 <p>After earning a doctorate in electrical engineering from <a href="https://www.mit.edu/">MIT</a>, Dabby became an engineering and music professor. She taught at <a href="https://www.tufts.edu/">Tufts University</a>, MIT, and <a href="https://www.juilliard.edu/">The Juilliard School</a>. She also continued to play concerts, performing at <a href="https://necmusic.edu/facilities">Jordan Hall</a>, <a href="https://www.bso.org/brands/tanglewood/general-info.aspx">Tanglewood</a>, and other venues in Massachusetts.</p> 
         1537 <p>In 2000, Dabby joined the <a href="https://www.olin.edu/">Olin College of Engineering</a>, in Needham, Mass., where she was one of 12 founding faculty members. In 2002 she established the <a href="https://meet.olin.edu/olin-isms/olin-conductorless-orchestra-oco">Olin Conductorless Orchestra</a> (OCO), which completed its 19th season this year. No conductor leads the orchestra; instead, the students work together to perfect their performances. The program is designed to give talented engineering students an expressive outlet while also helping them develop professional skills such as leadership, teamwork, and communication.</p> 
         1538 <p>Last year Dabby won a Best Paper Award from the <a href="https://www.asee.org/">American Society for Engineering Education</a>. Her winning paper—“<a href="https://peer.asee.org/the-engineers-orchestra-a-conductorless-orchestra-for-developing-21st-century-professional-skills.pdf"><em>The Engineers’ Orchestra: A Conductorless Orchestra for Developing 21st-Century Professional Skills</em></a>”—describes the program’s benefits.</p> 
         1539 <h3 class="tisubhead">TAKING A RISK</h3> 
         1540 <p>Dabby says music has always been an extension of herself, and she enjoyed the focus and expressivity that came with preparing for her concerts.</p> 
         1541 <p>Performing “just kept accentuating and improving my musicianship, and I loved that process,” she says. “The idea of reaching one’s full potential was very powerful to me.”</p> 
         1542 <p>She says she enjoyed taking risks in order to achieve her goal of bettering her skills as a musician.</p> 
         1543 <p>“I built up a very strong track record with taking risks,” she says, “whether during a performance or in my professional life.”</p> 
         1544 <p>And taking a risk is exactly what Dabby did after she came across an engineering journal at the <a href="http://www.nypl.org/locations/lpa">New York Public Library for the Performing Arts</a>. The journal contained articles by engineers whose avocation was music, and they inspired Dabby to ask: “What if a professional musician, one of my colleagues, or I acquired the tools of an engineer? Would we invent something new for music in our own time?”</p> 
         1545 <p>That idea pushed her to pursue a graduate degree in engineering while working as a performer and freelancer.</p> 
         1546 <p>In order to apply to graduate programs, she had to supplement her music bachelor’s degree with postbaccalaureate classes.</p> 
         1547 <p>“I had to [earn] around 127 credits because I had no math or science background,” Dabby says. She did so at the <a href="https://www.ccny.cuny.edu/">City College of New York</a>.</p> 
         1548 <p>“I retaught myself algebra and discovered that I loved it,” she says. “Engineering became this wonderful respite from performing. The engineering felt fresh. The music felt fresh.”</p> 
         1549 <p>After Dabby completed the credits she needed, she was accepted to MIT. For her doctoral thesis, she merged engineering and music. She devised a chaotic mapping tool—a representation of chaotic behavior that is typically used in mathematics—that could be used to make musical variations. The variations, which could be either changes in pitch or in the rhythmic sequence of a piece, could be close to the original work or mutate almost beyond recognition.</p> 
         1550 <p>Dabby has been granted four U.S. patents for her work.</p> 
         1551 <p>She says she wanted to “come up with something for music in the 21st century that wouldn’t necessarily occur to those who were not performers or professional musicians.”</p> 
         1552 <h3 class="tisubhead">CONDUCTORLESS ORCHESTRA</h3> 
         1553 <p>In fall 2000, when the Olin College of Engineering assembled a leadership team and faculty to begin from scratch, it paid attention to a list of skills the U.S. <a href="https://www.nae.edu/">National Academy of Engineering</a> wanted in engineering students. The list included leadership skills, effective communication, and the ability to work as part a team. The Olin faculty members brainstormed how they could help their students develop the skills, and that’s when the OCO was born.</p> 
         1554 <p>The idea “just popped into my head in our first meeting,” Dabby says. “I thought, Oh my gosh, this could mean a conductorless orchestra. Everyone leads, and everyone follows.”</p> 
         1555 <p>The students learn how to collaborate with one another and how to communicate effectively. The musicians learn to watch one another to ensure everyone starts and ends together, as well as adjust balance, dynamic levels, and tempo by listening intently and cueing one another, Dabby says.</p> 
         1556 <p>“It requires the musicians to actively listen to their parts within the context of a larger whole and adjust accordingly,” she wrote in her chapter of the book <a href="https://www.amazon.com/Creative-Knowing-Engineering-Diana-Bairaktarova/dp/3319493515"><em>Creative Ways of Knowing Engineering</em></a>. The chapter describes the OCO.</p> 
         1557 <p>Olin had only 75 students in its first year, and the first conductorless orchestra was composed of five engineering students, with Dabby at the piano. These days there are between 12 and 22 students, all selected by audition, in the OCO.</p> 
         1558 <p>The students select a piece to play, and Dabby creates an arrangement, adjusting the piece according to the instruments the students play.</p> 
         1559 <p>Each year, the musicians elect two to four navigators, who work with Dabby to ensure rehearsals run smoothly and communication lines remain open within the group. Together, along with two rehearsal leaders, they come up with the agenda for that week’s rehearsal.</p> 
         1560 <p>During rehearsals, orchestra members can share their thoughts regarding the different interpretations of the piece the group chose to play. The members play each interpretation, and the orchestra votes on which version it wants to perform.</p> 
         1561 <p>All involved in the OCO learn how to listen, when to speak, and when to refrain from sharing their thoughts.</p> 
         1562 <p>“Employers see the Olin Conductorless Orchestra on résumés and they’re curious,” Dabby says. “It’s actually helped students get jobs.”</p> 
         1563 <p>The program also has helped students during their time at the college.</p> 
         1564 <p>“It’s a stress-reliever,” Dabby says. The OCO “gives [students] balance in their lives.”</p> 
         1565 <p>The orchestra performs at school functions and travels once a year to play at other venues. Last year it received a standing ovation after performing at the <a href="https://publons.com/journal/111069/zone-1-conference-of-the-american-society-for-engi/">American Society for Engineering Education Zone 1 International Conference</a>, Dabby says.</p> 
         1566 <p>“There’s always an upcoming performance, and it’s another chance for students to raise the bar,” she says. “For students, it’s a challenge and a neat way to become better while doing something they love.”</p>]]></content:encoded>
         1567       <dc:creator>Joanna Goodrich</dc:creator>
         1568       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwODg5Mw.jpeg" />
         1569       <media:content url="https://spectrum.ieee.org/image/MzcwODg5Mw.jpeg" />
         1570     </item>
         1571     <item>
         1572       <title>No Implants Needed For Precise Control Deep Into The Brain</title>
         1573       <link>https://spectrum.ieee.org/the-human-os/biomedical/devices/deep-brain-control-without-implants</link>
         1574       <description>Optogenetics can now control neural circuits at unprecedented depths within living brain tissue without surgery</description>
         1575       <category>biomedical</category>
         1576       <category>biomedical/devices</category>
         1577       <pubDate>Mon, 26 Oct 2020 16:30:00 GMT</pubDate>
         1578       <guid>https://spectrum.ieee.org/the-human-os/biomedical/devices/deep-brain-control-without-implants</guid>
         1579       <content:encoded><![CDATA[<p>The first time Karl Deisseroth used light to control brain cells in a dish, people had a lot of questions, three in particular. <em>Can the technique be used in living animals? Can it target different cell types? Can it work without implanting a light source into the brain?</em></p> 
         1580 <p></p> 
         1581 <p>In the years since that initial groundbreaking 2004 <a href="https://www.nature.com/articles/nn1525">experiment</a>, Deisseroth’s team and others found the answers to the first two questions: <a href="https://web.stanford.edu/group/dlab/media/papers/Adamantidis%20Nature%202007.pdf">yes</a> and <a href="https://web.stanford.edu/group/dlab/media/papers/Sohal%20Nature%202009.pdf">yes</a>. This month&nbsp;they answered the third question with another yes,&nbsp;successfully introducing&nbsp;an implant-free version of the technique. It is the first demonstration that optogenetics—which uses a combination of light and genetic engineering to control brain cells—can accurately switch the cells on and off without surgery.</p> 
         1582 <p></p> 
         1583 <p>“This is kind of a nice bookend to 16 years of research,” says <a href="http://web.stanford.edu/group/dlab/research.html">Deisseroth</a>, a neuroscientist and bioengineer at Stanford University. “It took years and years for us to sort out how to make it work.” The result is described this month in the journal <em><a href="https://www.nature.com/articles/s41587-020-0679-9">Nature Biotechnology</a></em>.</p> 
         1584 <!--nextpage--> 
         1585 <p></p> 
         1586 <p>Optogenetics involves genetically engineering animal brains to express light-sensitive proteins—called opsins—in the membranes of neurons.&nbsp;The opsins’ reactions to pulses of light&nbsp;can either induce a neuron to “fire” or suppress its ability to fire. Optogenetics has been used to map brain pathways, identify how complex behaviors are regulated, create <a href="https://science.sciencemag.org/content/341/6144/387.long">false memories</a> in mice, and <a href="https://www.nature.com/articles/nn.4091">much more</a>. It’s also been used to develop an <a href="/the-human-os/biomedical/devices/scientists-control-a-flys-heart-with-a-laser">optogenetic pacemaker</a>, among other technologies.</p> 
         1587 <p></p> 
         1588 <p>Most of the time, getting the pulses of light inside the brain to control&nbsp;cells has required invasive <a href="/the-human-os/biomedical/devices/one-step-optogenetics">implants</a>: from tethered optical <a href="/tech-talk/biomedical/devices/lasers-switch-memories-from-bad-to-good">fibers</a>, to peppercorn-sized <a href="/biomedical/devices/neuroscientists-wirelessly-control-the-brain-of-a-scampering-lab-mouse">wireless implants</a>, to stretchy <a href="/the-human-os/biomedical/bionics/flexible-optogenetics-implants-hack-the-sense-of-pain">spinal implants</a>.</p> 
         1589 <p></p> 
         1590 <p>In April, <a href="https://mcgovern.mit.edu/profile/guoping-feng/">Guoping Feng</a> and colleagues at MIT, along with Deisseroth,&nbsp;<a href="https://www.cell.com/neuron/fulltext/S0896-6273(20)30239-7?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS0896627320302397%3Fshowall%3Dtrue">demonstrated</a> a minimally invasive optogenetic system that required drilling a small hole in the skull, then being&nbsp;<span>able to control opsin-expressing neurons six millimeters deep into the brain using blue light. This approach used of a type of opsin that slowly activates neurons in a step-wise manner.&nbsp;</span></p> 
         1591 <p></p> 
         1592 <p>In the most recent study,&nbsp;<span>Deisseroth and colleagues sought to instead enable both deep and fast optogenetics without surgery.</span> The Stanford team expressed <span>in the brain cells of mice&nbsp;</span>a powerful new opsin called ChRmine (pronounced like the deep-red color “carmine”),&nbsp;<a href="https://science.sciencemag.org/content/365/6453/eaaw5202">discovered</a> by Deisseroth’s group last year in a marine organism. Then, they shined a red light outside the skull and were able to activate neural circuits in the midbrain and brainstem at depths of up to 7 millimeters. With the technique, the scientists turned on and off brain circuits with millisecond precision. “It really worked well, far better than we even expected might be possible,” says Deisseroth.</p> 
         1593 <p></p> 
         1594 <p>The team then tested the effectiveness of the system. In one instance, they used light to quickly and precisely stop seizures in epileptic mice, and in another to turn on serotonin-producing neurons to promote social behavior in mice.</p> 
         1595 <p></p> 
         1596 <p>Most optogenetic techniques involve injecting viruses with an opsin gene of choice directly into the brain with a needle. To avoid this, the Stanford team used a type of PHP virus <a href="https://www.caltech.edu/about/news/delivering-genes-across-blood-brain-barrier-49679">developed at CalTech</a> that can be injected in the blood. The virus then crosses the blood-brain barrier to deliver its payload, an opsin gene, to brain cells. In this case, even the delivery of the gene is noninvasive—no needle penetrates the brain.</p> 
         1597 <p></p> 
         1598 <p>Deisseroth’s team is now testing the non-invasive technique in fish and collaborating with others to apply it to non-human primates. They’re also working with the Seattle-based <a href="https://alleninstitute.org/">Allen Institute</a> to develop mouse lines bred with ChRmine in their cells. “We hope these will be a broadly available and applicable research tool,” says Deisseroth. “We’re just excited to share this capability with everybody.”</p>]]></content:encoded>
         1599       <dc:creator>Megan Scudellari</dc:creator>
         1600       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwODU1MQ.jpeg" />
         1601       <media:content url="https://spectrum.ieee.org/image/MzcwODU1MQ.jpeg" />
         1602     </item>
         1603     <item>
         1604       <title>Should the DoD’s Tech Professionals Work From Home—Permanently?</title>
         1605       <link>https://spectrum.ieee.org/view-from-the-valley/at-work/tech-careers/should-the-dods-tech-professionals-work-from-homepermanently</link>
         1606       <description>The Defense Innovation Board recommends continuing remote work beyond the pandemic, at home and in a nationwide network of coworking spaces</description>
         1607       <category>at-work</category>
         1608       <category>at-work/tech-careers</category>
         1609       <pubDate>Mon, 26 Oct 2020 15:24:00 GMT</pubDate>
         1610       <guid>https://spectrum.ieee.org/view-from-the-valley/at-work/tech-careers/should-the-dods-tech-professionals-work-from-homepermanently</guid>
         1611       <content:encoded><![CDATA[<p class="MsoNormal"><span>With tech companies announcing plans to continue allowing—and even encouraging—employees to work remotely beyond the end of the pandemic, the <a href="https://innovation.defense.gov/">U.S. Defense Innovation Board</a> has urged the Department of Defense to improve its own work from home policies.</span></p> 
         1612 <p class="MsoNormal"><span>In <a href="https://innovation.defense.gov/Portals/63/documents/Meeting%20Documents/September%2015%202020/DIB_Digital%20Talent_CLEARED.pdf?ver=2020-09-15-111827-080">a September report</a>, the board, an advisory committee to the Secretary of Defense, pointed out that the DoD “has traditionally struggled to compete for digital talent,” and “the emerging work from home norm creates an opening for the Department to either adapt and narrow the gap or fall further behind in competing for top-notch technical talent.”</span></p> 
         1613 <p class="MsoNormal"><span>Right now, the Department of Defense is allowing some employees to work remotely, using standard remote collaboration tools with an extra layer of security, but has not decided whether use of these tools will be permitted after workers return to the office. The Defense Innovation Board’s report argues that not only should these tools be preserved, but the use of such tools, along with accompanying infrastructure upgrades, should be expanded. Embracing remote work permanently would, the report claims, allow the DoD to hire a “more agile, diverse, and distributed workforce.”</span></p> 
         1614 <p class="MsoNormal"><span>In addition to urging the DoD to follow in the footsteps of commercial tech employers, the Defense Innovation Board made a few suggestions that I haven’t seen coming from tech businesses, and which those firms might want to embrace in return. </span></p> 
         1615 <aside class="inlay pullquote rt med">
         1616   Senior DoD leaders “should commit to periodically working from home to model behavior, norms, and expectations around performance and presence” 
         1617 </aside> 
         1618 <p class="MsoNormal"><span>For one, the Board suggested that the DoD create “a nationwide network of dedicated co-working or shared workspaces” for remote work. This, it suggested, might be a way of handling classified work in a more distributed fashion, but it also could be a way for businesses to better fulfill <a href="/view-from-the-valley/at-work/tech-careers/will-the-tech-workplace-ever-be-the-same-again">the desires of employees</a> to live wherever they want, but work some number of days each week at home and some in an office.</span></p> 
         1619 <p class="MsoNormal"><span>In another suggestion, the Board urged that, as part of an effort to change the culture around remote work, that senior DoD leaders “should commit to periodically working from home to model behavior, norms, and expectations around performance and presence; this will also create a demand for IT capabilities to remain up-to-date and not atrophy.”</span></p> 
         1620 <p class="MsoNormal"><span>&nbsp;</span></p>]]></content:encoded>
         1621       <dc:creator>Tekla S. Perry</dc:creator>
         1622       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwODMxMQ.jpeg" />
         1623       <media:content url="https://spectrum.ieee.org/image/MzcwODMxMQ.jpeg" />
         1624     </item>
         1625     <item>
         1626       <title>Video Friday: Sarcos Is Developing a New Teleoperated Dexterous Robot</title>
         1627       <link>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-sarcos-guardian-xt-teleoperated-dexterous-robot</link>
         1628       <description>Your weekly selection of awesome robot videos</description>
         1629       <category>robotics</category>
         1630       <category>robotics/robotics-hardware</category>
         1631       <pubDate>Fri, 23 Oct 2020 19:00:00 GMT</pubDate>
         1632       <guid>https://spectrum.ieee.org/automaton/robotics/robotics-hardware/video-friday-sarcos-guardian-xt-teleoperated-dexterous-robot</guid>
         1633       <content:encoded><![CDATA[<p>Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (<a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a>!):</p> 
         1634 <h5><a href="http://www.iros2020.org/">IROS 2020</a> –&nbsp;October 25-29, 2020 –&nbsp;[Online]</h5> 
         1635 <h5><a href="https://roscon.ros.org/world/2020/">ROS World 2020</a> –&nbsp;November 12, 2020 –&nbsp;[Online]</h5> 
         1636 <h5><a href="https://cybathlon.ethz.ch/en/">CYBATHLON 2020</a> –&nbsp;November 13-14, 2020 –&nbsp;[Online]</h5> 
         1637 <h5><a href="https://sites.psu.edu/icsr2020/">ICSR 2020</a> –&nbsp;November 14-16, 2020 –&nbsp;Golden, Colo., USA</h5> 
         1638 <p><a href="mailto:automaton@ieee.org?subject=Robot%20video%20suggestion%20for%20Video%20Friday">Let us know</a> if you have suggestions for next week, and enjoy today's videos.</p> 
         1639 <hr> 
         1640 <!--nextpage--> 
         1641 <blockquote> 
         1642  <p><em><a href="/tech-talk/aerospace/robotic-exploration/the-long-arm-of-nasa-the-osirisrex-spacecraft-gets-ready-to-grab-an-asteroid-sample">NASA’s Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) spacecraft</a> unfurled its robotic arm Oct. 20, 2020, and in a first for the agency, briefly touched an asteroid to collect dust and pebbles from the surface for delivery to Earth in 2023.</em></p> 
         1643 </blockquote> 
         1644 <p></p> 
         1645 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/xj0O-fLSV7c" width="620"></iframe></p> 
         1646 <p></p> 
         1647 <p>[ <a href="https://www.nasa.gov/feature/goddard/2020/osiris-rex-tags-surface-of-asteroid-bennu">NASA</a> ]</p> 
         1648 <p></p> 
         1649 <hr> 
         1650 <p></p> 
         1651 <p>New from David Zarrouk’s lab at BGU is AmphiSTAR, which Zarrouk describes as “a kind of a ground-water drone inspired by the cockroaches (sprawling) and by the Basilisk lizard (running over water). The robot hovers due to the collision of its propellers with the water (hydrodynamics not aerodynamics). The robot can crawl and swim at high and low speeds and smoothly transition between the two. It can reach 3.5 m/s on ground and 1.5m/s in water.”</p> 
         1652 <p></p> 
         1653 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/qXgPQ7_yld0" width="620"></iframe></p> 
         1654 <p></p> 
         1655 <p>AmphiSTAR will be presented at IROS, starting next week!</p> 
         1656 <p>[ <a href="https://designandrobotics.weebly.com/">BGU</a> ]</p> 
         1657 <p></p> 
         1658 <hr> 
         1659 <p></p> 
         1660 <p>This is unfortunately not a great video of a video that was taken at a SoftBank Hawks baseball game in Japan last week, but it’s showing an Atlas robot doing an honestly kind of impressive dance routine to support the team.</p> 
         1661 <div class="embedcode"> 
         1662  <blockquote class="twitter-tweet" data-dnt="true"> 
         1663   <p>ロボット応援団に人型ロボット『ATLAS』がアメリカからリモートで緊急参戦!!!<br> ホークスビジョンの映像をお楽しみ下さい♪<a href="https://twitter.com/hashtag/sbhawks?src=hash&amp;ref_src=twsrc%5Etfw">#sbhawks</a> <a href="https://twitter.com/hashtag/Pepper?src=hash&amp;ref_src=twsrc%5Etfw">#Pepper</a> <a href="https://twitter.com/hashtag/spot?src=hash&amp;ref_src=twsrc%5Etfw">#spot</a> <a href="https://t.co/6aTYn8GGli">pic.twitter.com/6aTYn8GGli</a></p> — 福岡ソフトバンクホークス(公式) (@HAWKS_official) 
         1664   <a href="https://twitter.com/HAWKS_official/status/1317068357779124224?ref_src=twsrc%5Etfw">October 16, 2020</a> 
         1665  </blockquote> 
         1666  <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> 
         1667 </div> 
         1668 <p><em><strong>Editor’s Note: </strong>The tweet embed above is not working for some reason—see the video <a href="https://twitter.com/HAWKS_official/status/1317068357779124224?ref_src=twsrc%5Etfw">here</a>.</em></p> 
         1669 <p>[ <a href="https://www.softbankhawks.co.jp/global/english/index.html">SoftBank Hawks</a> ]</p> 
         1670 <p><em>Thanks Thomas!</em></p> 
         1671 <p></p> 
         1672 <hr> 
         1673 <p></p> 
         1674 <p><a href="/tag/Sarcos+Robotics">Sarcos</a> is working on a new robot, which looks to be the torso of their <a href="/automaton/robotics/industrial-robots/sarcos-guardian-xo-powered-exoskeleton">powered exoskeleton</a> with the human relocated somewhere else.</p> 
         1675 <p></p> 
         1676 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/4urGVpy5HiA" width="620"></iframe></p> 
         1677 <p></p> 
         1678 <p>[ <a href="https://www.sarcos.com/products/guardian-xt/">Sarcos</a> ]</p> 
         1679 <p></p> 
         1680 <hr> 
         1681 <p></p> 
         1682 <p>The biggest holiday of the year, International Sloth Day, was on Tuesday! To celebrate, here’s <a href="/automaton/robotics/robotics-hardware/why-we-need-robot-sloths">Slothbot</a>!</p> 
         1683 <p></p> 
         1684 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/bVOQ62YrXbI" width="620"></iframe></p> 
         1685 <p></p> 
         1686 <p>[ <a href="https://www.nsf.gov/news/mmg/mmg_disp.jsp?med_id=186892&amp;from=">NSF</a> ]</p> 
         1687 <p></p> 
         1688 <hr> 
         1689 <p></p> 
         1690 <p>This is one of those simple-seeming tasks that are really difficult for robots.</p> 
         1691 <p></p> 
         1692 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/0IlVgX9CtkU" width="620"></iframe></p> 
         1693 <p></p> 
         1694 <p>I love self-resetting training environments.</p> 
         1695 <p>[ <a href="https://www.csail.mit.edu/news/robot-stably-swings-objects-specific-poses">MIT CSAIL</a> ]</p> 
         1696 <p></p> 
         1697 <hr> 
         1698 <p></p> 
         1699 <blockquote> 
         1700  <p><em>The Chiel lab collaborates with engineers at the Center for Biologically Inspired Robotics Research at Case Western Reserve University to design novel worm-like robots that have potential applications in search-and-rescue missions, endoscopic medicine, or other scenarios requiring navigation through narrow spaces.</em></p> 
         1701 </blockquote> 
         1702 <p></p> 
         1703 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/uu7ADgg3u5I" width="620"></iframe></p> 
         1704 <p></p> 
         1705 <p>[ <a href="https://biology.case.edu/faculty/hillel-chiel/">Case Western</a> ]</p> 
         1706 <p></p> 
         1707 <hr> 
         1708 <p></p> 
         1709 <blockquote> 
         1710  <p><em>ANYbotics partnered with Losinger Marazzi to explore ANYmal’s potential of patrolling construction sites to identify and report safety issues. With such a complex environment, only a robot designed to navigate difficult terrain is able to bring digitalization to such a physically demanding industry.</em></p> 
         1711 </blockquote> 
         1712 <p></p> 
         1713 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/QjOlv45OSwA" width="620"></iframe></p> 
         1714 <p></p> 
         1715 <p>[ <a href="https://www.anybotics.com/anymal-tackles-construction-safety/">ANYbotics</a> ]</p> 
         1716 <p></p> 
         1717 <hr> 
         1718 <p></p> 
         1719 <p>Happy 2018 Halloween from Clearpath Robotics!</p> 
         1720 <p></p> 
         1721 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/FA5jCmC-Uaw" width="620"></iframe></p> 
         1722 <p></p> 
         1723 <p>[ <a href="https://clearpathrobotics.com/">Clearpath</a> ]</p> 
         1724 <p></p> 
         1725 <hr> 
         1726 <p></p> 
         1727 <blockquote> 
         1728  <p><em>Overcoming illumination variance is a critical factor in vision-based navigation. Existing methods tackled this radical illumination variance issue by proposing camera control or high dynamic range (HDR) image fusion. Despite these efforts, we have found that the vision-based approaches still suffer from overcoming darkness. This paper presents real-time image synthesizing from carefully controlled seed low dynamic range (LDR) image, to enable visual simultaneous localization and mapping (SLAM) in an extremely dark environment (less than 10 lux).</em></p> 
         1729 </blockquote> 
         1730 <p></p> 
         1731 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/XmmJBgy5PbQ" width="620"></iframe></p> 
         1732 <p></p> 
         1733 <p>[ <a href="https://irap.kaist.ac.kr/">KAIST</a> ]</p> 
         1734 <p></p> 
         1735 <hr> 
         1736 <p></p> 
         1737 <p>What can MoveIt do? Who knows! Let's find out!</p> 
         1738 <p></p> 
         1739 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/7KvF7Dj7bz0" width="620"></iframe></p> 
         1740 <p></p> 
         1741 <p>[ <a href="https://moveit.ros.org/">MoveIt</a> ]</p> 
         1742 <p><em>Thanks Dave!</em></p> 
         1743 <p></p> 
         1744 <hr> 
         1745 <p></p> 
         1746 <blockquote> 
         1747  <p><em>Here we pick a cube from a starting point, manipulate it within the hand, and then put it back. To explore the capabilities of the hand, no sensors were used in this demonstration. The RBO Hand 3 uses soft pneumatic actuators made of silicone. The softness imparts considerable robustness against variations in object pose and size. This lets us design manipulation funnels that work reliably without needing sensor feedback. We take advantage of this reliability to chain these funnels into more complex multi-step manipulation plans.</em></p> 
         1748 </blockquote> 
         1749 <p></p> 
         1750 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/VlO6huZAleM" width="620"></iframe></p> 
         1751 <p></p> 
         1752 <p>[ <a href="https://www.robotics.tu-berlin.de/menue/home/">TU Berlin</a> ]</p> 
         1753 <p></p> 
         1754 <hr> 
         1755 <p></p> 
         1756 <p>If this was a real solar array, <a href="/automaton/robotics/robotics-hardware/inflatable-robots-for-space">King Louie would have totally cleaned it</a>. Mostly.</p> 
         1757 <p></p> 
         1758 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/LPcKy92VAyw" width="620"></iframe></p> 
         1759 <p></p> 
         1760 <p>[ <a href="http://radlab.byu.edu/model-predictive-control-of-compliant-robots.html">BYU</a> ]</p> 
         1761 <p></p> 
         1762 <hr> 
         1763 <p></p> 
         1764 <blockquote> 
         1765  <p><em>Autonomous exploration is a fundamental problem for various applications of unmanned aerial vehicles(UAVs). Existing methods, however, were demonstrated to have low efficiency, due to the lack of optimality consideration, conservative motion plans and low decision frequencies. In this paper, we propose FUEL, a hierarchical framework that can support Fast UAV ExpLoration in complex unknown environments.</em></p> 
         1766 </blockquote> 
         1767 <p></p> 
         1768 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/_dGgZUrWk-8" width="620"></iframe></p> 
         1769 <p></p> 
         1770 <p>[ <a href="https://github.com/HKUST-Aerial-Robotics/FUEL">HKUST</a> ]</p> 
         1771 <p></p> 
         1772 <hr> 
         1773 <p></p> 
         1774 <blockquote> 
         1775  <p><em>Countless precise repetitions? This is the perfect task for a robot, thought researchers at the University of Liverpool in the Department of Chemistry, and without further ado they developed an automation solution that can carry out and monitor research tasks, making autonomous decisions about what to do next.</em></p> 
         1776 </blockquote> 
         1777 <p></p> 
         1778 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/7MFmbtIb8xA" width="620"></iframe></p> 
         1779 <p></p> 
         1780 <p>[ <a href="https://www.blog.kuka.com/2020/08/11/experiments-with-robot-support-automation-in-the-chemistry-laboratory/?lang=en">Kuka</a> ]</p> 
         1781 <p></p> 
         1782 <hr> 
         1783 <p></p> 
         1784 <blockquote> 
         1785  <p><em>This video shows a demonstration of central results of the SecondHands project. In the context of maintenance and repair tasks, in warehouse environments, the collaborative humanoid robot ARMAR-6 demonstrates a number of cognitive and sensorimotor abilities such as 1) recognition of the need of help based on speech, force, haptics and visual scene and action interpretation, 2) collaborative bimanual manipulation of large objects, 3) compliant mobile manipulation, 4) grasping known and unknown objects and tools, 5) human-robot interaction (object and tool handover) 6) natural dialog and 7) force predictive control.</em></p> 
         1786 </blockquote> 
         1787 <p></p> 
         1788 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/6cDgVrwchSg" width="620"></iframe></p> 
         1789 <p></p> 
         1790 <p>[ <a href="https://secondhands.eu/">SecondHands</a> ]</p> 
         1791 <p></p> 
         1792 <hr> 
         1793 <p></p> 
         1794 <p>In celebration of Ada Lovelace Day, Silicon Valley Robotics hosted a panel of Women in Robotics.</p> 
         1795 <p></p> 
         1796 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/NuN1fpdYYI4" width="620"></iframe></p> 
         1797 <p></p> 
         1798 <p>[ <a href="https://robohub.org/women-in-robotics-panel-celebrating-ada-lovelace-day/">Robohub</a> ]</p> 
         1799 <p></p> 
         1800 <hr> 
         1801 <p></p> 
         1802 <p>As part of the upcoming virtual IROS conference, HEBI robotics is putting together a tutorial on robotics actuation. While I’m sure HEBI would like you to take a long look at their own actuators, we’ve been assured that no matter what kind of actuators you use, this tutorial will still be informative and useful.</p> 
         1803 <p></p> 
         1804 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/7JrBDRNPxv4" width="620"></iframe></p> 
         1805 <p></p> 
         1806 <p>[ <a href="https://www.youtube.com/watch?v=7JrBDRNPxv4&amp;list=PLYy-ocPrXDOUovtgclU7kgqqlnC1-Nzfr&amp;index=1&amp;ab_channel=HEBIRobotics">YouTube</a> ] via [ <a href="https://www.hebirobotics.com/">HEBI Robotics</a> ]</p> 
         1807 <p><em>Thanks Dave!</em></p> 
         1808 <p></p> 
         1809 <hr> 
         1810 <p></p> 
         1811 <p>This week’s UMD Lockheed Martin Robotics Seminar comes from Julie Shah at MIT, on “Enhancing Human Capability with Intelligent Machine Teammates.”</p> 
         1812 <p></p> 
         1813 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/lwbmDj0_QhM" width="620"></iframe></p> 
         1814 <p></p> 
         1815 <blockquote> 
         1816  <p><em>Every team has top performers- people who excel at working in a team to find the right solutions in complex, difficult situations. These top performers include nurses who run hospital floors, emergency response teams, air traffic controllers, and factory line supervisors. While they may outperform the most sophisticated optimization and scheduling algorithms, they cannot often tell us how they do it. Similarly, even when a machine can do the job better than most of us, it can’t explain how. In this talk I share recent work investigating effective ways to blend the unique decision-making strengths of humans and machines. I discuss the development of computational models that enable machines to efficiently infer the mental state of human teammates and thereby collaborate with people in richer, more flexible ways.</em></p> 
         1817 </blockquote> 
         1818 <p>[ <a href="https://robotics.umd.edu/">UMD</a> ]</p> 
         1819 <p></p> 
         1820 <hr> 
         1821 <p></p> 
         1822 <p>Matthew Piccoli gives a talk to the UPenn GRASP Lab on “Trading Complexities: Smart Motors and Dumb Vehicles.”</p> 
         1823 <p></p> 
         1824 <p><iframe allowfullscreen frameborder="0" height="349" src="//www.youtube.com/embed/7uyGy5xEpxo" width="620"></iframe></p> 
         1825 <p></p> 
         1826 <blockquote> 
         1827  <p><em>We will discuss my research journey through Penn making the world's smallest, simplest flying vehicles, and in parallel making the most complex brushless motors. What do they have in common? We'll touch on why the quadrotor went from an obscure type of helicopter to the current ubiquitous drone. Finally, we'll get into my life after Penn and what tools I'm creating to further drone and robot designs of the future.</em></p> 
         1828 </blockquote> 
         1829 <p>[ <a href="https://www.grasp.upenn.edu/">UPenn</a> ]</p> 
         1830 <p></p> 
         1831 <hr> 
         1832 <p></p>]]></content:encoded>
         1833       <dc:creator>Evan Ackerman</dc:creator>
         1834       <dc:creator>Erico Guizzo</dc:creator>
         1835       <dc:creator>Fan Shi</dc:creator>
         1836       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwODEwOQ.jpeg" />
         1837       <media:content url="https://spectrum.ieee.org/image/MzcwODEwOQ.jpeg" />
         1838     </item>
         1839     <item>
         1840       <title>Researchers at the University of Tehran Devise New Sputum Test for COVID-19</title>
         1841       <link>https://spectrum.ieee.org/news-from-around-ieee/the-institute/ieee-member-news/researchers-at-the-university-of-tehran-devise-new-sputum-test-for-covid19</link>
         1842       <description>This electrochemical diagnostic tool uses carbon nanotubes to diagnose an upper respiratory infection in 30 seconds</description>
         1843       <category>the-institute</category>
         1844       <category>the-institute/ieee-member-news</category>
         1845       <pubDate>Fri, 23 Oct 2020 18:00:00 GMT</pubDate>
         1846       <guid>https://spectrum.ieee.org/news-from-around-ieee/the-institute/ieee-member-news/researchers-at-the-university-of-tehran-devise-new-sputum-test-for-covid19</guid>
         1847       <content:encoded><![CDATA[<link href="/ns/interactive/0118race-to-5g/css/5g-logo-treatments.css" rel="stylesheet"> 
         1848 <style type="text/css">.entry-content .tisubhead {
         1849     color: #999999;
         1850     font-family: verdana;
         1851     font-size: 14px;
         1852     font-weight: bold;
         1853     letter-spacing: 1px;
         1854     margin-bottom: -5px !important;
         1855     text-transform: uppercase;
         1856 }
         1857 
         1858 .tiblogopener {
         1859     color: #a17e54;
         1860    font-family: Theinhardt-Medium, sans-serif;
         1861   letter-spacing: 1px;
         1862   margin-right: 10px;
         1863     font-weight: bold;
         1864     text-transform: uppercase;
         1865 }
         1866 </style> 
         1867 <div class="mobileHide"> 
         1868  <div class="imgWrapper offsetLeft lt sm"> 
         1869   <a href="/static/covid19-ieee-resources"><img alt="IEEE COVID-19 coverage logo, link to landing page" src="/image/MzYxMjU3Mg.jpeg"></a> 
         1870  </div> 
         1871 </div> 
         1872 <p><span class="tiblogopener">THE INSTITUTE </span>Nasopharyngeal swabs are the most common way to collect a sample from a person in order to test her for COVID-19. Retrieving the specimen requires a medical professional to insert a long shaft into a person’s nasal cavity. The procedure is often uncomfortable for people and requires medical professionals to break social distancing parameters.</p> 
         1873 <p>IEEE Member <a href="https://www.linkedin.com/in/mohammad-abdolahad-1a0aa1a0/">Mohammad Abdolahad</a> led a team of undergraduate students and post-doctoral candidates at the <a href="https://ut.ac.ir/en">University of Tehran</a> that developed a non-invasive, electrochemical diagnostic system. Called the ROS [reactive oxygen species] Detector in Sputum Sample (RDSS), the test screens for respiratory inflammation in real-time and doesn’t require a medical professional to swab for the specimen. ROS are reactive&nbsp;<a href="https://en.wikipedia.org/wiki/Chemical_species">chemical species</a> that&nbsp;contain oxygen and can severely damage DNA, RNA, and proteins. This tool can determine the presence of ROS produced by respiratory inflammation.</p> 
         1874 <p>Abdolahad is an associate professor of electrical engineering at the <a href="https://ut.ac.ir/en/page/329/school-of-electrical-computer-engineering">University of Tehran’s School of Electrical and Computer Engineering</a> as well as an adjunct professor at the university’s <a href="http://en.tums.ac.ir/en">School of Medical Sciences</a>.</p> 
         1875 <p><a href="/the-institute"><em>The Institute</em></a>&nbsp;asked him about how RDSS works.</p> 
         1876 <p><em>This interview has been edited and condensed for clarity.</em></p> 
         1877 <p></p> 
         1878 <p><strong>What problem are&nbsp;you&nbsp;trying to solve? </strong></p> 
         1879 <p>Since controlling the spread of the virus [largely] depends on screening suspected cases, it is important to have widely available, reliable, and fast [testing] methods. Unfortunately, the current screening methods, such as Polymerase chain reaction, do not satisfy these requirements. [PCR checks for the presence of the SARS-CoV-2 virus, which causes COVID-19]</p> 
         1880 <p>Consequently, we have developed a fast method to screen for respiratory inflammation [in] real-time.<strong> </strong>The test can also help inform doctors if the patient has an increased chance of contracting COVID-19. Respiratory diseases can make a patient immunoresistant and by being diagnosed, the patient now knows that she needs to take additional steps in order to protect herself against coronavirus.</p> 
         1881 <p><strong>Explain how the system works.</strong></p> 
         1882 <p>The ROS test is done by taking a sample of the patient’s sputum. [The patient takes a deep breath and holds it in for five seconds. She then slowly breathes out and repeats these steps until she coughs up sputum.] The patient then spits the sputum into a falcon tube [a plastic cup].</p> 
         1883 <p>Each individual sample is tested using the RDSS probe. The doctor [puts] the probe into the sample and the results are [displayed] on the monitor after 30 seconds.</p> 
         1884 <p><strong>What technologies are you using? </strong></p> 
         1885 <p>The system consists of <span class="fontstyle01">an integrated monitor that connects to a probe, which has a disposable sensor located on top of it. The probe is used to test the [sputum] samples<span class="fontstyle01"> and the monitor displays the results to the medical professional conducting the test. </span></span></p> 
         1886 <p><span class="fontstyle01">The <span class="fontstyle01">sensor on top of the probe is fabricated using multi-wall carbon nanotubes, which sit <span class="fontstyle01">on the tip of several steel needles. The needles are arranged in three <span class="fontstyle01">electrodes—working, counter, and reference—with a triangular distance of 3 millimeters from each other. [Reference electrodes measure the potential of the working electrode without passing current through it while counter electrodes pass current.]</span></span></span></span></p> 
         1887 <p><span class="fontstyle01">The tool [is portable], which allows the device to be utilized freely by phlebotomists and physicians in laboratories or clinics. </span></p> 
         1888 <p><span class="fontstyle01">The software [programmed in the device] <span class="fontstyle01">was designed based on experimental calibration [in order to] analyze the data and provide a <span class="fontstyle01">diagnosis in under 30 seconds.</span></span></span></p> 
         1889 <p><strong>What challenges have you faced, and how did you overcome&nbsp;them?</strong></p> 
         1890 <p>The first challenge was calibrating the sensor in correlation with the presence and severity of COVID-19 in the [patients].</p> 
         1891 <p>We conducted a study and tested more than 100 people to better understand the differences between COVID-19 and [other types of] <span class="fontstyle01">respiratory diseases. We found that in some respiratory illnesses, such as asthma and acute pneumonia, there is an increase in ROS. Seasonal influenza on the other hand induces a reduction in ROS levels [in the] immune system and suppresses certain bacterial clearance [the effect a drug has on&nbsp;bacteria].</span></p> 
         1892 <p>The other challenge that we faced was collecting enough data to calibrate the sensor. It was a challenge to find participants for the study due to quarantine restrictions and the danger of working closely with infected cases. [<span class="fontstyle01">In the end we were able to] test the sensor on more than 300 participants—both confirmed COVID-19 cases and negative cases.</span></p> 
         1893 <p><strong>What is the potential impact of the technology?</strong></p> 
         1894 <p>A real-time ROS-based respiratory inflammation warning system during the pandemic could help control the spread of the virus. It can [also] be used as a support system to help determine the severity of respiratory inflammatory diseases based on ROS levels in the patient’s sputum culture.</p> 
         1895 <p><strong>How&nbsp;close&nbsp;are you&nbsp;to the final&nbsp;product?&nbsp;</strong></p> 
         1896 <p>We [completed developing the system] and received a temporary certificate from the Iranian Food and Drug Administration that allows us to sell the system to medical centers. Our U.S. patent was also received its Notice of the Office communication on four main claims and passed the examiner queries. Hence it will be granted soon. &nbsp;</p> 
         1897 <p>The sensor has been deployed in four hospitals, as a non-invasive real-time complementary system, for further observational clinical trials.</p> 
         1898 <p><strong>How can other IEEE members get involved?</strong></p> 
         1899 <p>We have only tested [the system] on [patients] in Iran [and] the system can be improved by [testing samples in other countries]. Researchers can also try to find alternative, [inexpensive] materials [to use] as sensing agents for the ROS detection system.</p> 
         1900 <p>IEEE members who work in similar areas can help test ROS levels in the sputum culture of COVID-19 patients who were treated. This would help us find a suitable drug dose to treat the patients [with] and [better understand how] to monitor the severity of the patients’ symptoms.</p> 
         1901 <div class="mobileShow"> 
         1902  <h3 class="RptHdBackBarMobile"><span class="BackArrowBlkBkgrd">&lt;</span>&nbsp;<a href="/static/covid19-ieee-resources">Back to IEEE COVID-19 Resources</a></h3> 
         1903 </div>]]></content:encoded>
         1904       <dc:creator>The Institute’s Editorial Staff</dc:creator>
         1905       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNzczOA.jpeg" />
         1906       <media:content url="https://spectrum.ieee.org/image/MzcwNzczOA.jpeg" />
         1907     </item>
         1908     <item>
         1909       <title>Researchers Pack 10,000 Metasurface Pixels Per Inch in New OLED Display</title>
         1910       <link>https://spectrum.ieee.org/tech-talk/consumer-electronics/audiovideo/metasurface-oled-display</link>
         1911       <description>The novel display from Samsung and Stanford could find use in VR, AR</description>
         1912       <category>consumer-electronics</category>
         1913       <category>consumer-electronics/audiovideo</category>
         1914       <pubDate>Fri, 23 Oct 2020 16:53:00 GMT</pubDate>
         1915       <guid>https://spectrum.ieee.org/tech-talk/consumer-electronics/audiovideo/metasurface-oled-display</guid>
         1916       <content:encoded><![CDATA[<p><span>A new OLED display from Samsung and Stanford can achieve more than 10,000 pixels per inch, which might lead to advanced virtual reality and augmented reality displays, researchers say.</span></p> 
         1917 <p></p> 
         1918 <p><span>An <a href="/tech-talk/semiconductors/optoelectronics/the-heat-is-on-for-better-oleds">organic light-emitting diode</a> (OLED) display possesses a film of organic compounds that emits light in response to an electric current. A commercial large-scale OLED television might have a pixel density of about 100 to 200 pixels per inch (PPI), whereas a mobile phone's OLED display might achieve 400 to 500 PPI.</span></p> 
         1919 <p></p> 
         1920 <p><span>Two different kinds of OLED displays have reached commercial success in mobile devices and large-scale TVs. Mobile devices mostly used red, green and blue OLEDs, which companies manufacture by depositing dots of organic film through metal sheets with many tiny holes punched in them. However, the thickness of these metal sheets limits how small these fabricated dots can be and sagging of these metal sheets limits how large these displays can get.</span></p> 
         1921 <p></p> 
         1922 <p><span>In contrast, large-scale TVs use white OLEDs with color filters placed over them. However, these filters absorb more than 70% of light from the OLEDs. As such, these displays are power-hungry and can suffer "burn-in" of images that linger too long. The filters also limit how much the pixels can scale down in size.</span></p> 
         1923 <p><span>The new display uses OLED films to emit white light between two reflective layers, one of which is made of a silver film, whereas the other is a "<a href="/tech-talk/semiconductors/optoelectronics/metasurfaces-metamaterials-image-processing">metasurface</a>," or forest of microscopic pillars each spaced less than a wavelength of light apart. Square clusters of these 80-nanometer-high, 100-nanometer-wide silver pillars served as pixels each roughly 2.4 microns wide, or slightly less than 1/10,000th of an inch.</span></p> 
         1924 <p></p> 
         1925 <p><span>Each pixel in the new display's metasurface is divided into four subpixels of equal size. In principle, the OLED films can specify which subpixels they illuminate. The nano-pillars in each subpixel manipulate white light falling onto them, such that each subpixel can reflect a specific color of light, depending on the amount of spacing between its nano-pillars. In each pixel, the subpixel with the most densely packed nano-pillars yields red light; the one with moderately densely packed nano-pillars yields green light; and the two with the least densely packed nano-pillars yield blue light.</span></p> 
         1926 <figure class="xlrg" role="img"> 
         1927  <img alt="A scanning electron microscopy image of the forest of nano-pillars that underlies the new OLED display. This array serves as a series of reflective cavities that define the display's pixels. Credit: Mark Brongersma et al., Science." src="/image/MzcwNzU3NQ.jpeg"> 
         1928  <figcaption class="hi-cap">
         1929    Image:&nbsp;Mark Brongersma/Science 
         1930  </figcaption> 
         1931  <figcaption> 
         1932   <p>A scanning electron microscopy image of the forest of nano-pillars that underlies the new OLED display. This array&nbsp;serves as a series of reflective cavities that define the display's pixels.&nbsp;</p> 
         1933  </figcaption> 
         1934 </figure> 
         1935 <p></p> 
         1936 <p><span>Emitted light reflects back and forth between the display's reflective layers until it finally escapes through the silver film out the display's surface. The way in which light can build up within the display gives it twice the luminescence efficiency of standard color-filtered white OLED displays, as well as higher color purity, the researchers say.</span></p> 
         1937 <p></p> 
         1938 <p><span>"If you think of a musical instrument, you often see an acoustic cavity that sounds come out of that helps make a nice and beautiful pure tone," </span>says study senior author Mark Brongersma, an optical engineer at Stanford University. <span>"The same happens here with light — the different colors of light can resonate in these pixels."</span></p> 
         1939 <p></p> 
         1940 <p><span>In the near term, one potential application for this new display is with virtual reality (VR). Since VR headsets place their displays close to a user's eyes, high-resolutions are key to help create the illusion of reality, Brongersma says.</span></p> 
         1941 <p></p> 
         1942 <p><span>As impressive as 10,000 pixels per inch might sound, "according to our simulation results, the theoretical scaling limit of pixel density is estimated to be 20,000 pixels per inch," says study lead author Won-Jae Joo, a nanophotonic engineer at the Samsung Advanced Institute of Technology in Suwon, Korea. "The challenge is the trade-off in brightness when the pixel dimensions go below one micrometer."</span></p> 
         1943 <p></p> 
         1944 <p><span>Other research groups have developed displays they say range from <a href="https://www.jb-display.com/about">10,000</a> to <a href="https://www.vuereal.com/technology">30,000</a> pixels </span>per inch, typically using <a href="https://www.nature.com/articles/s41377-020-0268-1">micro-LED technology</a>, such as Jade Bird Display in China and <a href="/tech-talk/semiconductors/optoelectronics/canadian-startup-vuereal-takes-on-apple-in-microled-displays">VueReal</a> in <span>Canada. In terms of how the new OLED display compares with those others, "our color purity is very high," </span>Brongersma says.</p> 
         1945 <p></p> 
         1946 <p>In the future, metasurfaces might also find use trapping light in applications such as solar cells and light sensors, Brongersma says.</p> 
         1947 <p></p> 
         1948 <p><span>The scientists detailed </span><span lang="zxx"><a href="https://science.sciencemag.org/cgi/doi/10.1126/science.abc8530"><span>their findings</span></a></span><span> online Oct. 22 in the journal <em>Science</em>.</span></p>]]></content:encoded>
         1949       <dc:creator>Charles Q. Choi</dc:creator>
         1950       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNzUyNw.jpeg" />
         1951       <media:content url="https://spectrum.ieee.org/image/MzcwNzUyNw.jpeg" />
         1952     </item>
         1953     <item>
         1954       <title>Use Your Bike as a Backup to Your Backup Power Supply</title>
         1955       <link>https://spectrum.ieee.org/geek-life/hands-on/use-your-bike-as-a-backup-to-your-backup-power-supply</link>
         1956       <description>Combining solar and pedal power should get you through most outages</description>
         1957       <category>geek-life</category>
         1958       <category>geek-life/hands-on</category>
         1959       <pubDate>Fri, 23 Oct 2020 15:00:00 GMT</pubDate>
         1960       <guid>https://spectrum.ieee.org/geek-life/hands-on/use-your-bike-as-a-backup-to-your-backup-power-supply</guid>
         1961       <content:encoded><![CDATA[<figure class="xlrg" role="img"> 
         1962  <img alt="Image of a bicycle, solar panel, and battery set up. " src="/image/MzcwNTQ2NA.jpeg"> 
         1963  <div class="ai"> 
         1964   <figcaption class="hi-cap">
         1965     Illustration: James Provost 
         1966   </figcaption> 
         1967  </div> 
         1968 </figure> 
         1969 <p><strong>As this article goes to</strong> press, <a href="https://en.wikipedia.org/wiki/Hurricane_Sally">Hurricane Delta is making landfall not far from where Hurricanes Sally</a> and<a href="https://en.wikipedia.org/wiki/Hurricane_Laura"> Laura</a> came ashore earlier in the season. I live on the East Coast of the United States, and fairly far inland, so such storms are not as frequent or intense as they are in states bordering on the Gulf of Mexico. But they are still a concern, if only because they can topple trees and cause widespread power outages. And ice storms during the winter here are also apt to bring down power lines.</p> 
         1970 <p>I don’t mind the resulting darkness so much. What I really don’t relish, though, is losing Internet access—especially now that it is my main connection to the world due to the pandemic. And the pandemic is reducing the number of field crews available to fix power lines, making outages last that much longer. So this year, I figured I’d get prepared for a blackout in the most self-sufficient way possible.</p> 
         1971 <p>I could, of course, just purchase a conventional small gasoline-powered generator. But I didn’t want to do that for a few reasons. In particular, I recalled stories about what went on after Hurricane Sandy in 2012, when many people using such backup generators had <a href="https://money.cnn.com/2012/11/02/news/economy/gas-shortage-sandy/index.html">trouble finding fuel</a>, <a href="https://grouper.ieee.org/groups/802/11/email/stds-802-11/msg00151.html">including the IEEE Operations Center</a>, in New Jersey.</p> 
         1972 <p>My first thought was to use photovoltaics, so I purchased two 100-watt panels on Amazon for less than US $1/W. I had a 35-ampere DC-to-DC converter from an earlier project to use as a charge controller, so my next step was to spec out a deep-cycle lead-acid battery to keep things going at night.</p> 
         1973 <p>Some experiments with a watt meter led me to conclude that a battery of at least 300 watt-hours capacity could keep four laptops, a cable modem, and a wireless router running for about 4 hours, while also charging the family’s phones and flashlights. That should get us through dark evenings. It would also suffice at a reduced load if the sun were hidden behind clouds all day. So I purchased a <a href="https://www.amazon.com/Sealed-Lead-Acid-Cycle-Battery/dp/B005CLPOQM">12-volt, 35-ampere-hour battery</a> (which nominally can store 420 Wh).</p> 
         1974 <p>But what if skies remained gray for many days in a row? Rather than trying to purchase enough battery storage to cover all reasonable eventualities, I decided that my backup source of electricity needed a backup itself, one that I could use to charge that battery during times when my photovoltaic panels won’t function. When necessary, I’d simply detach the battery from the panels and attach it to my backup source to be recharged. I briefly considered whether a wind turbine might serve that role, but then opted for something I figured would be more dependable: my two legs.</p> 
         1975 <figure class="xlrg" role="img"> 
         1976  <img alt="Illustration of the components of the bike battery." src="/image/MzcwNTQ3NA.jpeg"> 
         1977  <div class="ai"> 
         1978   <figcaption class="hi-cap">
         1979     Illustration: James Provost 
         1980   </figcaption> 
         1981   <figcaption> 
         1982    <strong>Parallel Power: </strong>My system primarily relies on photovoltaic panels to charge a deep-cycle lead-acid battery via a DC-DC converter. When there’s not enough sunlight to use the panels, I switch over to a generator driven by the back wheel of a conventional bike mounted in a stand. This requires a rectifier and a meter mounted on the handlebars to monitor the power produced and fed into the battery. 
         1983   </figcaption> 
         1984  </div> 
         1985 </figure> 
         1986 <p>I cycle regularly for about an hour a day, during which I probably put out an average of 80 W, based on some <a href="https://www.omnicalculator.com/sports/cycling-wattage">rough calculations</a> (I’m small and typically cycle around 22 kilometers per hour). That 80 Wh is only a fraction of what my solar panels can provide in a day, but it would be enough to keep my laptop connected to the Internet for a few hours, charge phones, and so forth. And pedaling my own power seemed like a healthy, stress reducing, activity to pass the time during a power outage.</p> 
         1987 <p>I discovered from one blogger that it wouldn’t be hard to <a href="https://genesgreenmachine.com/best-design-diy-bike-trainer-pedal-generator/">modify a stationary bike stand</a> to generate electrical power. Although I had a bike stand already, mine provides frictional drag using a fluid-filled chamber, which I was reluctant to crack open. Instead, I purchased one similar to the one that the blogger used, <a href="https://www.amazon.com/dp/B004I576SM/ref=as_li_ss_tl?coliid=IIA896P72CEZY&amp;colid=2DI66QDON6MB8&amp;psc=1&amp;ref_=lv_ov_lig_dp_it&amp;linkCode=sl1&amp;tag=genesgreenm0f-20&amp;linkId=51fa9a139813ae5806979de511e6988c&amp;language=en_US">which employs magnets and eddy currents</a> to create drag forces on a shaft that presses against the back wheel to increase exercise intensity.</p> 
         1988 <p>I ripped out all that drag-inducing stuff and attached a <a href="https://www.amazon.com/gp/product/B07QGNGLL4">brushless motor</a> to the shaft using a <a href="https://www.amazon.com/gp/product/B06X9X4CS7">flexible coupler</a> and a wooden spacer. Then I connected the three leads of the motor—originally intended to motorize a skateboard and now acting as a generator—to a three-phase <a href="https://www.amazon.com/Baomain-Heatsink-Shape-Bridge-Rectifier/dp/B01JKRIPUK/ref=as_li_ss_tl?_encoding=UTF8&amp;pd_rd_i=B01JKRIPUK&amp;pd_rd_r=5ac5179d-a540-40d3-a489-b4556bb78346&amp;pd_rd_w=YK42D&amp;pd_rd_wg=TnoZ8&amp;pf_rd_p=52b7592c-2dc9-4ac6-84d4-4bda6360045e&amp;pf_rd_r=TXPWAMX19BEWTVAGZ447&amp;psc=1&amp;refRID=TXPWAMX19BEWTVAGZ447&amp;linkCode=sl1&amp;tag=genesgreenm0f-20&amp;linkId=e649871a3ddd029cfa759a5092329e8b&amp;language=en_US">bridge rectifier</a>. The output of the rectifier in turn is connected to my battery through a <a href="https://www.amazon.com/gp/product/B017FSED9I/ref=as_li_ss_tl?ie=UTF8&amp;psc=1&amp;linkCode=sl1&amp;tag=genesgreenm0f-20&amp;linkId=a378088c1ce316c434b227b26e7c8a6e&amp;language=en_US">Drok meter</a>. This meter allows me to monitor the voltage, current, wattage, and total energy produced.</p> 
         1989 <p>Testing my power-producing bicycle stand quickly revealed a flaw in my logic. Pedaling at a comfortable pace, meaning one that I could keep up for a long time, produced only about 60 W, not 80. In retrospect, I decided that I had failed to consider the inefficiencies of power conversion, which surely are significant because my motor/generator gets pretty hot after a while. But even 60 Wh would do in a pinch. And there’s no rule that says I couldn’t cycle for longer than an hour. Even better, I can get my kids to contribute a little sweat to support their phone and computer use during the gray days of a winter power outage.</p> 
         1990 <p>Actually, generating power with their muscles provides a valuable lesson for kids, whether or not the power goes out. Every time they switch on a lightbulb or a television, they will think more about what this energy consumption means, given the considerable effort it takes to produce those watts yourself.</p> 
         1991 <p><em>This article appears in the November 2020 print issue as “Pedaling Out of the Dark.”</em></p>]]></content:encoded>
         1992       <dc:creator>David Schneider</dc:creator>
         1993       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNTQ0Mw.jpeg" />
         1994       <media:content url="https://spectrum.ieee.org/image/MzcwNTQ0Mw.jpeg" />
         1995     </item>
         1996     <item>
         1997       <title>New Sensor Integrated Within Dental Implants Monitors Bone Health</title>
         1998       <link>https://spectrum.ieee.org/the-human-os/biomedical/devices/new-sensor-integrated-within-dental-implants-monitors-bone-health</link>
         1999       <description>The device could one day mean less x-rays for people with dental implants</description>
         2000       <category>biomedical</category>
         2001       <category>biomedical/devices</category>
         2002       <pubDate>Fri, 23 Oct 2020 13:00:00 GMT</pubDate>
         2003       <guid>https://spectrum.ieee.org/the-human-os/biomedical/devices/new-sensor-integrated-within-dental-implants-monitors-bone-health</guid>
         2004       <content:encoded><![CDATA[<link href="/ns/interactive/0118race-to-5g/css/5g-logo-treatments.css" rel="stylesheet"> 
         2005 <div class="mobileHide"> 
         2006  <div class="imgWrapper offsetLeft lt sm"> 
         2007   <a href="/static/journal-watch"><img alt="Journal Watch report logo, link to report landing page" src="/image/MzI0MTAwOQ.jpeg"></a> 
         2008  </div> 
         2009 </div> 
         2010 <p>Scientists have created a new sensor that can be integrated within dental implants to passively monitor bone growth, bypassing the need for multiple x-rays of the jaw. The design is described in <a href="https://ieeexplore.ieee.org/document/9206139">study</a> published September 25 in <em><a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=7361">IEEE Sensors Journal</a></em>.</p> 
         2011 <p>Currently, x-rays are used to monitor jaw health following a dental implant. Dental x-rays typically involve low doses of radiation, but people with dental implants may require more frequent x-rays to monitor their bone health following surgery. And, as professor Alireza Hassanzadeh of Shahid Beheshti University, Tehran, notes, “Too many X-rays is not good for human health.”</p> 
         2012 <p>To reduce this need for x-rays, Hassanzadeh and two graduate students at <a href="http://en.sbu.ac.ir/SitePages/Home.aspx">Shahid Beheshti University</a> designed a new sensor that can be integrated within dental implants. It passively measures changes in the surrounding electrical field (capacitance) to monitor bone growth. Two designs, for short- and long-term monitoring, were created.</p> 
         2013 <p>The sensors are made of titanium and poly-ether-ether-ketone, and are integrated directly into a dental implant using microfabrication methods. The designs do not require any battery, and passively monitor changes in capacitance once the dental implant is in place.</p> 
         2014 <p>“When the bone is forming around the sensor, the capacitance of the sensor changes,” explains Hassanzadeh. This indicates how the surrounding bone growth changes over time. The changes in capacitance, and thus bone growth, are then conveyed to a reader device that transfers the measurements into a data logger.&nbsp;&nbsp;</p> 
         2015 <p>In their study, the researchers tested the sensors in the femur and jaw bone of a cow. “The results reveal that the amount of bone around the implant has a direct effect on the capacitance value of the sensor,” says Hassanzadeh.</p> 
         2016 <p>He&nbsp;says that the sensor still needs to be optimized for size and different implant shapes, and clinical experiments will need to be completed with different kinds of dental implant patients.&nbsp;“We plan to commercialize the device after some clinical tests and approval from FDA and authorities,” says Hassanzadeh.</p> 
         2017 <div class="mobileShow"> 
         2018  <h3 class="RptHdBackBarMobile"><span class="BackArrowBlkBkgrd">&lt;</span>&nbsp;<a href="/static/journal-watch">Back to IEEE Journal Watch</a></h3> 
         2019 </div>]]></content:encoded>
         2020       <dc:creator>Michelle Hampson</dc:creator>
         2021       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwMzE2Ng.jpeg" />
         2022       <media:content url="https://spectrum.ieee.org/image/MzcwMzE2Ng.jpeg" />
         2023     </item>
         2024     <item>
         2025       <title>The Lithium-Ion Battery With Built-In Fire Suppression</title>
         2026       <link>https://spectrum.ieee.org/tech-talk/energy/batteries-storage/liion-batteries-more-efficient-fireproof</link>
         2027       <description>New design also increases energy density for Li-ion batteries</description>
         2028       <category>energy</category>
         2029       <category>energy/batteries-storage</category>
         2030       <pubDate>Thu, 22 Oct 2020 20:51:00 GMT</pubDate>
         2031       <guid>https://spectrum.ieee.org/tech-talk/energy/batteries-storage/liion-batteries-more-efficient-fireproof</guid>
         2032       <content:encoded><![CDATA[<p>If there are superstars in battery research, you would be safe in identifying at least one of them as <a href="https://profiles.stanford.edu/yi-cui?releaseVersion=7.22.3">Yi Cui</a>, a scientist at Stanford University, whose research group over the years has introduced some <a href="/searchContent?query=Yi+Cui+&amp;max=10&amp;page=0&amp;sortby=relevance&amp;q=Yi+Cui+Li-ion">key breakthroughs in battery technology</a>.</p> 
         2033 <p>Now Cui and his research team, in collaboration with <a href="https://www6.slac.stanford.edu/">SLAC National Accelerator Laboratory</a>,&nbsp;have offered some exciting new capabilities for lithium-ion batteries based around a new polymer material they are using in the current collectors for them. The researchers claim this new design to current collectors increases efficiency in Li-ion batteries and reduces the risks of fires associated with these batteries.</p> 
         2034 <p>Current collectors are thin metal foils that distribute current to and from electrodes in batteries. Typically these metal foils are made from copper. Cui and his team redesigned these current collectors so that they are still largely made from copper but are now surrounded by a polymer.</p> 
         2035 <p>The Stanford team claim in their research published in the journal <em><a href="https://www.nature.com/articles/s41560-020-00702-8">Nature Energy</a> </em>that the polymer makes the current collector 80 percent lighter, leading to an increase in energy density from 16 to 26 percent. This is a significant boost over the average yearly increase of energy density for Li-ion batteries, which has been <a href="https://www.google.com/search?q=lithium+battery+energy+density+has+increased+by+percent&amp;espv=2&amp;source=lnms&amp;tbm=isch&amp;sa=X&amp;ved=0ahUKEwii36Teq9TSAhVY2WMKHbDyAUc4ChD8BQgHKAI&amp;biw=889&amp;bih=491">stuck at 5 percent a year</a> seemingly forever.</p> 
         2036 <figure class="xlrg" role="img"> 
         2037  <img alt="Scientists at Stanford and SLAC redesigned current conductors - thin metal foils that distribute current to and from electrodes - to make lithium-ion batteries lighter, safer and more efficient. They replaced the all-copper conductor, middle, with a layer of lightweight polymer coated in ultrathin copper (top right), and embedded fire retardant in the polymer layer to quench flames (bottom right). " src="/image/MzcwNzI1Mw.jpeg"> 
         2038  <figcaption class="hi-cap">
         2039    Image:&nbsp;Yusheng Ye/Stanford University 
         2040  </figcaption> 
         2041  <figcaption>
         2042    Scientists at Stanford and SLAC redesigned current conductors, thin metal foils that distribute current to and from electrodes, to make lithium-ion batteries lighter, safer and more efficient. They replaced the all-copper conductor, middle, with a layer of lightweight polymer coated in ultrathin copper (top right), and embedded fire retardant in the polymer layer to quench flames (bottom right).&nbsp; 
         2043  </figcaption> 
         2044 </figure> 
         2045 <p>This method of lightening the batteries is a bit of a novel approach to boosting energy density. Over the years we have seen many attempts to increase energy density by e<a href="/searchContent?q=nanostructured+silicon&amp;type=&amp;sortby=relevance">nlarging the surface area of electrodes through the use of new electrode materials</a>—such as nanostructured silicon&nbsp; in place of activated carbon. While increased surface area may increase charge capacity, energy density is calculated by the total energy over the total weight of the battery.</p> 
         2046 <p>The Stanford team have calculated the increase of 16 to 26 percent in the gravimetric energy density of their batteries by replacing the commercial&nbsp; copper/aluminum current collectors (8.06 mg/cm<sup>2</sup><span style="display:inline !important"> for copper and 5.0 mg/cm</span><sup><span>2</span></sup><span style="display:inline !important"> for aluminum) with&nbsp;their polymer collections current collectors (1.54 mg/cm</span><sup>2</sup><span style="display:inline !important"> for polymer-copper material and 1.05 mg/cm</span><sup>2 </sup>for polymer-aluminum).&nbsp;</p> 
         2047 <p>“Current collectors don’t contribute to the total energy but contribute to the total weight of battery,” explained <a href="https://profiles.stanford.edu/yusheng-ye">Yusheng Ye</a>, a researcher at Stanford and co-author of this research. “That’s why we call current collectors ‘dead weight’&nbsp;in batteries, in contrast to ‘active weight’&nbsp;of electrode materials.”</p> 
         2048 <aside class="inlay pullquote lt med offsetLeft">
         2049   Whenever the battery has combustion issues, our current collector will instantaneously release the fire retardant and extinguish the fire. 
         2050 </aside> 
         2051 <p>By reducing the weight of the current collector, the energy density can be increased, even when the total energy of the battery is almost unchanged. Despite the increased energy density offered by this research, it may not entirely alleviate so-called “range anxiety” associated with electric vehicles in which people have a fear of running out of power before reaching the next charge location. While the press release claims that this work will extend the range of electric vehicles, Ye noted that the specific energy improvement in this latest development is based on the battery itself. As a result, it is only likely to have around a 10% improvement in the range of an electric vehicle.</p> 
         2052 <p>“In order to improve the range from 400 miles to 600 miles, for example, more engineering work would need to be done taking into account the active parts of the batteries will need to be addressed together with our ultra-light current collectors,” said Ye.</p> 
         2053 <p>Beyond improved energy density efficiency, the polymer-based charge collectors are expected to help reduce the fires associated with Li-ion batteries. Of course, traditional copper current collectors don’t contribute to battery combustion on their own. The <a href="/energy/renewables/less-fire-more-power-the-secret-to-safer-lithiumion-batteries">combustion issues in Li-ion batteries</a>&nbsp; are related to the electrolyte and separator that are not used within the recommended temperatures and voltage windows.</p> 
         2054 <p>“One of the key innovations in our novel current collector is that we are able to embed fire retardant inside without sacrificing the energy density and mechanical strength of the current collector,” said Ye. “Whenever the battery has combustion issues, our current collector will instantaneously release the fire retardant and extinguish the fire. Such function cannot be achieved with traditional copper or aluminum current collector.”</p> 
         2055 <p>The researchers have patented the technology and are in discussions with battery manufacturers for commercialization. Cui and his team have already worked out some of the costs associated with adopting the polymer and they appear attractive. According to Ye, the cost of the polymer composite charge collector is around $1.3 per m<sup>2</sup>, which is a bit lower than the cost of copper foil, which is around $1.4 per m<sup>2</sup>. With these encouraging numbers, Ye added: “We are expecting industry to adopt this technology within the next few years.”</p>]]></content:encoded>
         2056       <dc:creator>Dexter Johnson</dc:creator>
         2057       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNzIxNg.jpeg" />
         2058       <media:content url="https://spectrum.ieee.org/image/MzcwNzIxNg.jpeg" />
         2059     </item>
         2060     <item>
         2061       <title>From Foldable Phones to Stretchy Screens</title>
         2062       <link>https://spectrum.ieee.org/consumer-electronics/portable-devices/from-foldable-phones-to-stretchy-screens</link>
         2063       <description>Today, you can buy a smartphone with a foldable display. Tomorrow you may wear a screen that can stretch</description>
         2064       <category>consumer-electronics</category>
         2065       <category>consumer-electronics/portable-devices</category>
         2066       <pubDate>Thu, 22 Oct 2020 18:00:00 GMT</pubDate>
         2067       <guid>https://spectrum.ieee.org/consumer-electronics/portable-devices/from-foldable-phones-to-stretchy-screens</guid>
         2068       <content:encoded><![CDATA[<figure class="xlrg" role="img"> 
         2069  <img alt="A image displaying the foldable aspects of the new phones." src="/image/MzcwNzAxMg.jpeg"> 
         2070  <figcaption class="hi-cap">
         2071    Photo-illustration: Dan Saelinger 
         2072  </figcaption> 
         2073 </figure> 
         2074 <p><strong><a href="https://www.motorola.com/us/">Motorola</a> demonstrated the very</strong> first handheld mobile phone almost a half century ago. It was the size of a brick and weighed half as much. That prototype spawned the first commercial mobile phone a decade later. It, too, was ungainly, but it allowed a person to walk around while sending and receiving phone calls, which at that point was a great novelty. Since then, mobile phones have acquired many other functions. They now have the ability to handle text messages, browse the Web, play music, take and display photos and videos, locate the owner on a map, and serve countless other uses—applications well beyond what anybody could have imagined when mobile phones were first introduced.</p> 
         2075 <p>But smartphones, nimble as they are, have struggled to overcome one seemingly fundamental drawback: Their displays are small. Sure, some phones have been made larger than normal to provide more real estate for the display. But make the phone too big and it outgrows the owner’s pocket, which is a nonstarter for many people.</p> 
         2076 <p>The obvious solution is to have the display fold up like a wallet. For years, developing suitable technology for that has been one of our goals as researchers at <a href="https://en.snu.ac.kr/index.html">Seoul National University</a>. It’s also been a goal for smartphone manufacturers, who just in the past year or two have been able to bring this technology to market.</p> 
         2077 <p>Soon, phones with foldable screens will no doubt proliferate. You or someone in your family will probably have one, at which point you’ll surely wonder: How in the world is it possible for the display to bend like that? We figured we’d explain what’s behind that technology to you here so that you’re ready when a phone with a large, bright, flexible display comes to a pocket near you—not to mention even more radical electronic devices that will be possible when their screens can stretch as well as bend.</p> 
         2078 <p><strong>Researchers have been seriously</strong> investigating how to make flexible displays for about two decades. But for years, they remained just that—research projects. In 2012, though, Bill Liu and some other Stanford engineering graduates set out to commercialize flexible displays by founding the <a href="https://www.royole.com/us">Royole Corp.</a> (which now has headquarters in both Fremont, Calif., and Shenzhen, China).</p> 
         2079 <figure class="xlrg" role="img"> 
         2080  <img alt="Person demonstrating the bendable display of the first commercial smartphone." src="/image/MzcwNzI5Nw.jpeg"> 
         2081  <figcaption class="hi-cap">
         2082    Photo: Robyn Beck/Getty Images 
         2083  </figcaption> 
         2084  <div class="ai"> 
         2085   <figcaption> 
         2086    <p><strong>A Closed Book:</strong> In late 2018, Royole Corp. developed the first commercial smartphone with a bendable display, the FlexPai. It folds closed with the screen still visible on the outside.</p> 
         2087   </figcaption> 
         2088  </div> 
         2089 </figure> 
         2090 <p>In late 2018, Royole introduced the <a href="https://www.royole.com/us/flexpai">FlexPai</a>, whose flexible display allows the device to unfold into something that resembles a tablet. The company demonstrated that this foldable display could withstand 200,000 bending cycles—and quite tight bends at that, with a radius of curvature of just 3 millimeters. But the FlexPai phone was more of a prototype than a mature product. <a href="https://www.theverge.com/2019/1/8/18174278/royole-flexpai-foldable-phone-android-ces-2019">A review</a> published in <em>The Verge</em>, for example, called it “charmingly awful.”</p> 
         2091 <p>Soon afterward, <a href="https://www.samsung.com/us/">Samsung</a> and <a href="https://www.huawei.com/us/">Huawei</a>, the world’s two largest smartphone makers, began offering their own foldable models. Samsung Mobile officially announced its <a href="https://www.samsung.com/us/mobile/galaxy-fold/">Galaxy Fold</a> in February 2019. It features dual foldable displays that can be bent with a radius of curvature as small as 1 mm, allowing the phone to fold up with the display on the inside. Huawei announced its first foldable smartphone, the <a href="https://en.wikipedia.org/wiki/Huawei_Mate_X">Mate X</a>, later that month. The Mate X is about 11 mm thick when folded, and its display (like that of the FlexPai) is on the outside, meaning that the bending radius of the display is roughly 5 mm. And in February of this year, each company introduced a second foldable model: Samsung’s <a href="https://www.samsung.com/us/mobile/galaxy-z-flip/">Galaxy Z Flip</a> and Huawei’s <a href="https://consumer.huawei.com/en/phones/mate-x-s/">Mate Xs</a>/5G.</p> 
         2092 <p>The most challenging part of engineering these phones was, of course, developing the display itself. The key was to reduce the thickness of the flexible display panel so as to minimize the bending stresses it has to endure when folded. The smartphone industry has just figured out how to do that, and panel suppliers such as <a href="https://www.samsungdisplay.com/eng/index.jsp">Samsung Display</a> and Beijing-based <a href="https://www.boe.com/en/about/gsjs/">BOE Technology Group</a> Co. are now mass-producing foldable displays.</p> 
         2093 <p>Like those found in conventional smartphones, these are all active-matrix organic light-emitting-diode (<a href="https://en.wikipedia.org/wiki/AMOLED">AMOLED</a>) displays. But instead of fabricating these AMOLEDs on a rigid glass substrate, as is normally done, these companies use a thin, flexible polymer. On top of that flexible substrate is the backplane—the layer containing the many thin-film transistors needed to control individual pixels. Those transistors incorporate a buffer layer that can prevent cracks from forming when the display is flexed.</p> 
         2094 <p>Although flexible displays constructed along these lines are fast becoming more common for phones and other consumer products, the standards that apply to these displays, as well as language for describing their ability to bend, are still, you might say, taking shape. These displays can be at least broadly characterized according to the radius of curvature they can withstand when flexed: “Conformable” refers to displays that don’t bend all that tightly, “rollable” refers to ones with intermediate levels of flexibility, and “foldable” describes those that can accommodate a very small radius of curvature.</p> 
         2095 <p>Because any material, be it a smartphone display or a steel plate, is in tension on the outside surface of a bend and in compression on the inside, the electronic components that make up a display must resist those stresses and the corresponding deformations they induce. And the easiest way to do that is by minimizing those shape-changing forces by bringing the outside surface of a flexed display closer to the inside surface, which is to say to make the device very thin.</p> 
         2096 <p>To make the display as thin as possible, designers omit the protective film and polarizer that normally go on top, along with the adhesive applied between these layers. While removing those elements is not ideal, both the protective film and antireflection polarizer are optional components for AMOLED displays, which generate light internally rather than modifying the amount of light transmitted from an LED backlight, as in liquid-crystal displays.</p> 
         2097 <p><strong>Another difference</strong> between flexible and conventional displays has to do with the transparent conductive electrodes that sandwich the light-emitting organic materials that make the pixels shine. Normally, a layer of indium tin oxide (ITO) fills this role. But ITO is quite brittle under tension, making it a bad choice for flexible displays. To make matters worse, ITO tends to adhere poorly to flexible polymer substrates, causing it to buckle and delaminate when compressed.</p> 
         2098 <p>Researchers battling this problem a decade ago found a few strategies for improving the adhesion between ITO and a flexible substrate. One is to treat the substrate with oxygen plasma before depositing the ITO electrode on top. Another is to insert a thin layer of metal (such as silver) between the electrode and the substrate. It also helps to place the top of the substrate in the exact middle of the layer cake that makes up the display. This arrangement puts the fragile interface with the ITO layer on the display’s mechanical neutral plane, which experiences neither compression nor tension when flexed. Currently, the leading electronics companies that make foldable displays are using this strategy.</p> 
         2099 <p>Even simpler, you can get rid of the ITO electrodes altogether. While that hasn’t been done yet in commercial devices, this strategy is attractive for reasons having nothing to do with the desire for flexibility. You see, indium is both toxic and expensive, so you really don’t want to use it if you don’t have to. Fortunately, over the years researchers, including the two of us, have come up with several other materials that could function as transparent electrodes for flexible displays.</p> 
         2100 <p>A flexible film that contains silver nanowires is probably the most promising candidate. These vanishingly tiny wires form a mesh that conducts electricity while remaining largely transparent. Such a layer can be prepared at low cost by applying a solution containing silver nanowires to the substrate in a manner similar to that of printing ink on newsprint.</p> 
         2101 <figure class="xlrg" role="img"> 
         2102  <img alt="Image displaying the flexible features of the Huawei Mate Xs smartphone." src="/image/MzcwNzE2MA.jpeg"> 
         2103  <div class="ai"> 
         2104   <figcaption class="hi-cap">
         2105     Photo: Tolga Akmen/Getty Images 
         2106   </figcaption> 
         2107   <figcaption> 
         2108    <strong>Into the Fold:</strong> In 2019, Huawei introduced a line of phones with flexible displays. Shown here is the the company's Mate Xs. 
         2109   </figcaption> 
         2110  </div> 
         2111 </figure> 
         2112 <p>Most of the research on silver nano­wires has been focused on finding ways to reduce the resistance of the junctions between individual wires. You can do that by adding certain other materials to the nanowire mesh, for example. Or you can physically treat the nanowire layer by heating it in an oven or by sending enough electricity through it to fuse the nanowire junctions through Joule heating. Or you can also treat it by hot-pressing it, subjecting it to a plasma, or irradiating it with a very bright flash to fuse the junctions. Which of these treatments is the best to use will depend in large part on the nature of the substrate onto which the nanowires are applied. A polymer substrate, such as polyethylene terephthalate (PET, the same material that many clear plastic food containers are made of), is prone to problematic amounts of deformation when heated. Polyimide is less sensitive to heat, but it has a yellowish color that can compromise the transparency of an electrode created in this way.</p> 
         2113 <p>But metal nanowires aren’t the only possible substitute for ITO when creating transparent conductive electrodes. Another one is graphene, a form of carbon in which the atoms are arranged in a two-dimensional honeycomb pattern. Graphene doesn’t quite match ITO’s superb conductivity and optical transparency, but it is better able to withstand bending than any other electrode material now being considered for flexible displays. And graphene’s somewhat lackluster electrical conductivity can be improved by combining it with a conducting polymer or by doping it with small amounts of nitric acid or gold chloride.</p> 
         2114 <p>Yet another possibility is to use a conductive polymer. The prime example is poly(3,4-ethylenedioxythiophene) polystyrene sulfonate—a mouthful that normally goes by the shorter name <a href="https://en.wikipedia.org/wiki/PEDOT:PSS">PEDOT:PSS</a>. Such polymers can be dissolved in water, which allows thin, transparent electrodes to be easily fabricated by printing or spin coating (an industrial process akin to making <a href="https://en.wikipedia.org/wiki/Spin_art">spin art</a>). The right chemical additives can significantly improve the ability of a film of this conductive polymer to bend or even stretch. Careful selection of additives can also boost the amount of light that displays emit for a given amount of current, making them brighter than displays fabricated using ITO electrodes.</p> 
         2115 <p><strong>Up to now, the organic LED&nbsp;displays</strong> used in mobile phones, computer monitors, and televisions have mainly been fabricated by putting the substrate under vacuum, evaporating whatever organic material you want to add to it, and then using metal masks to control where those substances are deposited. Think of it as a high-tech stenciling operation. Those metal masks with their very fine patterns are hard to fabricate, though, and much of the applied material is wasted, contributing to the high cost of large display panels.</p> 
         2116 <p>An interesting alternative, however, has emerged for fabricating such displays: inkjet printing. For that, the organic material you want to apply is dissolved in a solvent and then jetted onto the substrate where it is needed to form the many pixels, followed by a subsequent heating step to drive off any solvent that remains. <a href="https://www.dupont.com/">DuPont</a>, <a href="https://www.emdgroup.com/en">Merck</a>, <a href="https://www.nissanchem.co.jp/eng/">Nissan Chemical Corp.</a>, and <a href="https://www.sumitomocorp.com/en/jp">Sumitomo</a> are pursuing this tactic, even though the efficiency and reliability of the resulting devices still remain far lower than needed. But if one day these companies succeed, the cost of display fabrication should diminish considerably.</p> 
         2117 <figure class="stacked xlrg" role="img"> 
         2118  <img alt="Person demonstrating the flexible display of the Samsung Galaxy Fold smartphone." src="/image/MzcwNzEwNA.jpeg"> 
         2119  <figcaption class="hi-cap">
         2120    Photo: Jung Yeon-je/AFP/Getty Images 
         2121  </figcaption> 
         2122  <figcaption> 
         2123   <strong>Bent on Competing:</strong> Samsung also introduced its own lines of phones with flexible displays in 2019. Shown here is that company's Galaxy Fold. 
         2124  </figcaption> 
         2125 </figure> 
         2126 <p>For makers of small displays for smartphones, an even higher priority than keeping costs down is reducing power consumption. Organic LEDs (OLEDs) are becoming less power hungry, but the more mature the OLED industry becomes, the more difficult it will be to further trim power consumption from its current value of around 6 milliwatts per square centimeter (about 40 mW&nbsp;per square inch). And the diminishing returns here are especially problematic for foldable phones, which boast displays that are much larger than normal. So it’s probably a safe bet that your foldable phone, compact as it is, will have to contain an especially hefty battery, at least in the near term.</p> 
         2127 <p><strong>What’s next for</strong> flexible displays after they allow our smartphones to fold? Given how much people seem glued to their phones now, we anticipate that in the not-so-distant future, people will start wearing displays that attach directly to the skin. They’ll likely use these devices initially to visualize various kinds of biometric data, but other applications will no doubt emerge. And perhaps such wearable displays will one day be used just to make a high-tech fashion statement.</p> 
         2128 <p>The materials used to produce such a display should, of course, be soft enough not to be bothersome when attached to the skin. What’s more, they would have to be stretchable. Fabricating intrinsically stretchable conductors and semiconductors is an enormous challenge, though. So for several years researchers have been exploring the next-best thing: geometrically stretchable displays. These contain rigid but tiny electronic components attached to a stretchable substrate and connected by conductive pathways that can withstand the deformation that accompanies stretching.</p> 
         2129 <p>More recently, though, there’s been progress in developing intrinsically stretchable displays—ones in which the conductors and semiconductors as well as the substrate can all be stretched. Such displays require some novel materials, to be sure, but perhaps the greatest hurdle has been figuring out how to devise stretchable materials to encapsulate these devices and protect them from the damaging effects of moisture and oxygen. Our research team has recently made good progress in that regard, successfully developing air-stable, intrinsically stretchable light-emitting devices that do not require stretchable protective coatings. These devices can be stretched to almost twice their normal length without failing.</p> 
         2130 <p>Today, only very crude prototypes of stretchable displays have been fabricated, ones that provide just a coarse grid of luminous elements. But industry’s interest in stretchable displays is huge. This past June, South Korea’s Ministry of Trade, Industry and Energy assigned <a href="http://www.lgdisplay.com/eng/main">LG Display</a> to lead a consortium of industrial and academic researchers to develop stretchable displays.</p> 
         2131 <p>With just a little imagination, you can envision what’s coming down the road: athletes festooned with biometric displays attached to their arms or legs, smartphones we wear on the palms of our hands, displays that drape conformably over various curved surfaces. The people who are working hard now to develop such future displays will surely benefit from the many years of research that have already been done to create today’s foldable displays for smartphones. Without doubt, the era for not just bendable but also stretchable electronics will soon be here.</p> 
         2132 <p><em>This article appears in the November 2020 print issue as “Displays That Bend and Stretch.”</em></p> 
         2133 <h2>About the Authors</h2> 
         2134 <p><a href="https://www.linkedin.com/in/huanyu-zhou-0b6aa7155/">Huanyu Zhou</a> is studying for a doctorate at Seoul National University under the direction of <a href="https://eng.snu.ac.kr/node/14028">Tae-Woo Lee</a>, a professor of materials science and engineering there.</p>]]></content:encoded>
         2135       <dc:creator>Huanyu Zhou</dc:creator>
         2136       <dc:creator>Tae-Woo Lee</dc:creator>
         2137       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNjgzNQ.jpeg" />
         2138       <media:content url="https://spectrum.ieee.org/image/MzcwNjgzNQ.jpeg" />
         2139     </item>
         2140     <item>
         2141       <title>Intel Creating Cryptographic Codes That Quantum Computers Can't Crack</title>
         2142       <link>https://spectrum.ieee.org/tech-talk/computing/hardware/how-to-protect-the-internet-of-things-in-the-quantum-computing-era</link>
         2143       <description>Intel researchers developed a hardware accelerator that helps IoT devices use post-quantum cryptography</description>
         2144       <category>computing</category>
         2145       <category>computing/hardware</category>
         2146       <pubDate>Thu, 22 Oct 2020 17:44:00 GMT</pubDate>
         2147       <guid>https://spectrum.ieee.org/tech-talk/computing/hardware/how-to-protect-the-internet-of-things-in-the-quantum-computing-era</guid>
         2148       <content:encoded><![CDATA[<link href="/ns/interactive/0118race-to-5g/css/5g-logo-treatments.css" rel="stylesheet"> 
         2149 <div class="mobileHide"> 
         2150  <div class="imgWrapper offsetLeft lt sm"> 
         2151   <a href="/static/journal-watch"><img alt="Journal Watch report logo, link to report landing page" src="/image/MzI0MTAwOQ.jpeg"></a> 
         2152  </div> 
         2153 </div> 
         2154 <p>The world will need a new generation of cryptographic algorithms once quantum computing becomes powerful enough to crack the&nbsp;codes that protect everyone’s digital&nbsp;privacy.&nbsp;An Intel team has created an improved version of such a quantum-resistant cryptographic algorithm that could work more efficiently on the smart home and industrial&nbsp;devices making&nbsp;up the Internet of Things.</p> 
         2155 <p>The Bit-flipping Key Encapsulation (BIKE)&nbsp;provides a way to create a shared secret that encrypts sensitive information exchanged between two devices. The encryption process requires computationally complex operations involving mathematical problems that could strain the&nbsp;hardware of many Internet of Things (IoT) devices. But Intel researchers figured out how to create a&nbsp;hardware accelerator that enables&nbsp;the BIKE software to run efficiently on less powerful hardware.</p> 
         2156 <p>“Software execution of BIKE, especially on lightweight IoT devices, is latency and power intensive,” says&nbsp;<span><a href="https://www.researchgate.net/profile/Manoj_Sastry">Manoj Sastry</a>,&nbsp;principal engineer at Intel.</span>&nbsp;“The BIKE hardware accelerator proposed in this paper shows feasibility for IoT-class devices.”</p> 
         2157 <p>Intel has been working in cooperation with several other companies to develop&nbsp;BIKE as one possible quantum-resistant algorithm among the many being currently evaluated by the U.S. National Institute of Standards and Technology. This<span>&nbsp;latest&nbsp;version of BIKE developed primarily by the Intel team was&nbsp;<a href="https://eprint.iacr.org/2020/117.pdf">presented in a paper</a>&nbsp;[PDF] during the&nbsp;</span><a href="https://qce.quantum.ieee.org/"><span>IEEE International C</span><span>onference on Q</span><span>uantum C</span><span>omputing and E</span></a><span><a href="https://qce.quantum.ieee.org/">ngineering</a> on 13 October 2020.&nbsp;</span></p> 
         2158 <p>BIKE securely establishes a shared secret between two devices through&nbsp;a three-step process, says <a href="https://www.linkedin.com/in/santosh-ghosh-4b763621/">Santosh Ghosh</a>,&nbsp;a research scientist at Intel and coauthor on the paper. First, the host device creates&nbsp;a public-private key pair and sends&nbsp;the public key to the client. Second, the client sends an encrypted message using the public key to the host. And third, the host decodes the encrypted message through a BIKE decode procedure using the private key. “O<span>f these three steps, BIKE decode is the most compute intensive operation,” Ghosh explains.</span></p> 
         2159 <p>The improved version of BIKE takes advantage of&nbsp;a new decoder that requires less computing power. Testing showed that this enabled the computation of a single BIKE decode operation in 1.3 million cycles at 110 MHz on an <a href="https://www.intel.com/content/www/us/en/products/programmable/fpga/arria-10.html">Intel Arria 10 FPGA</a> in 12 milliseconds, which is fairly competitive compared to other options.</p> 
         2160 <p><span>“</span><span>BIKE is well suited for applications where IoT devices are used for encapsulation and a more capable device takes the role of host to generate the keys and perform the decapsulation procedure,</span><span>” Ghosh says.</span></p> 
         2161 <p>This also represents the first hardware implementation of BIKE suitable for Level 5 keys and ciphertexts, with level 5 representing the highest level of security as defined by the U.S. National Institute of Standards and Technology. Each&nbsp;higher level&nbsp;of security requires bigger keys and ciphertexts—the encrypted forms of data that would look unintelligible to prying eyes—which in turn require more compute-intensive operations.&nbsp;</p> 
         2162 <p>The team was previously focused on BIKE implementations suitable for the&nbsp;lower security levels of 1 and 3, which meant&nbsp;the public hardware implementation&nbsp;submitted to NIST as reference did not support level 5, says&nbsp;<a href="https://www.linkedin.com/in/rafael-misoczki-phd-24b33013/">Rafael Misoczki</a>, coauthor on the paper who was formerly at Intel and is now a cryptography engineer at Google.</p> 
         2163 <p>The latest hardware implementation for BIKE takes the security up a couple of notches.</p> 
         2164 <p>“Our BIKE decoder supports keys and ciphertexts for Level 5 which provides security equivalent to <a href="https://www.nist.gov/publications/advanced-encryption-standard-aes">AES-256</a>,” s<span>ays&nbsp;</span><span><a href="https://www.linkedin.com/in/andrew-reinders-ba2a54103/">Andrew Reinders</a>,&nbsp;security researcher at Intel and coauthor on the paper.</span>&nbsp;“This means a quantum computer needs 2<sup>128 </sup>operations to break it.”</p> 
         2165 <p>The latest version of the BIKE hardware accelerator has a design that&nbsp;offers additional security against side-channel attacks&nbsp;in which attackers attempt to&nbsp;exploit information about the power consumption or even timing of software processes. For example, <a href="/tech-talk/artificial-intelligence/machine-learning/how-prevent-ai-power-usage-secrets">differential power analysis</a> attacks can track the&nbsp;power consumption patterns associated with running certain computational tasks to reveal some of the underlying computations.</p> 
         2166 <p>In theory, such attacks against BIKE might attempt to&nbsp;target a small block of the secret being shared between&nbsp;two devices. That block size depends upon how many secret bits work together in underlying sub-operations, Reinders says. Because a BIKE block size is 1-bit, a BIKE hardware design that processed a single secret bit at a time would be highly vulnerable to such a differential power analysis&nbsp;attack.</p> 
         2167 <p>But the new BIKE hardware accelerator offers&nbsp;protection against such attacks because it performs all the computations for the 128-bits of secret in parallel, which makes it difficult to single out power consumption patterns associated with individual computations.</p> 
         2168 <p>The BIKE hardware accelerator also has protection against timing attacks, because its decoder always runs for a fixed number of rounds that are each the same amount of time.</p> 
         2169 <p>BIKE was previously chosen as one of eight alternates in the third round of NIST’s&nbsp;<span><a href="https://www.nist.gov/news-events/news/2020/07/nists-post-quantum-cryptography-program-enters-selection-round">Post-Quantum Cryptography Standardization Process</a> that is currently&nbsp;<a href="/tech-talk/telecom/security/how-the-us-is-preparing-for-quantum-computings-threat-to-end-secrecy">narrowing down the best candidates</a> to replace modern cryptography standards. Seven other algorithms were selected as finalists, making for a total of 15 algorithms remaining under consideration from a starting group of 69 submissions.</span></p> 
         2170 <p>The researchers have&nbsp;submitted their revised version of BIKE for the&nbsp;third round of the NIST challenge and are currently awaiting comments. If all goes well, the federal agency plans to release the initial standard for quantum-resistant cryptography in 2022.</p> 
         2171 <div class="mobileShow"> 
         2172  <h3 class="RptHdBackBarMobile"><span class="BackArrowBlkBkgrd">&lt;</span>&nbsp;<a href="/static/journal-watch">Back to IEEE Journal Watch</a></h3> 
         2173 </div>]]></content:encoded>
         2174       <dc:creator>Jeremy Hsu</dc:creator>
         2175       <media:thumbnail url="https://spectrum.ieee.org/image/MzcwNzI3MQ.jpeg" />
         2176       <media:content url="https://spectrum.ieee.org/image/MzcwNzI3MQ.jpeg" />
         2177     </item>
         2178   </channel>
         2179 </rss>