Life. It’s the one thing that, so far, makes Earth unique among the thousands of other planets we’ve discovered. Since the fall of 1997, NASA satellites have continuously and globally observed all plant life at the surface of the land and ocean. During the week of Nov. 13-17, we are sharing stories and videos about how this view of life from space is furthering knowledge of our home planet and the search for life on other worlds.
Earth is the only planet with life, as far as we know. From bacteria in the crevices of the deepest oceans to monkeys swinging between trees, Earth hosts life in all different sizes, shapes and colors. Scientists often study Earth from the ground, but some also look to our satellites to understand how life waxes and wanes on our planet.
Over the years, scientists have used this aerial view to study changes in animal habitats, track disease outbreaks, monitor forests and even help discover a new species. While this list is far from comprehensive, these visual stories of bacteria, plants, land animals, sea creatures and birds show what a view from space can reveal.
Known as the grass of the ocean, phytoplankton are one of the most abundant types of life in the ocean. Usually single-celled, these plant-like organisms are the base of the marine food chain. They are also responsible for the only long-term transfer of carbon dioxide from Earth’s atmosphere to the ocean.
Even small changes in phytoplankton populations can affect carbon dioxide concentrations in the atmosphere, which could ultimately affect Earth’s global surface temperatures. Scientists have been observing global phytoplankton populations continuously since 1997, starting with the Sea-Viewing Wide Field-of-view Sensor (SeaWiFS). They continue to study the small life-forms by satellite, ship and aircraft.
Found on the surface of zooplankton and in contaminated water, the bacteria that cause the infectious disease cholera — Vibrio cholerae — affect millions of people every year with severe diarrhea, sometimes leading to death. While our satellite sensors can’t detect the actual bacteria, scientists use various satellite data to look for the environmental conditions that the bacteria thrive in.
Specifically, microbiologist Rita Colwell at the University of Maryland, College Park, and West Virginia University hydrologist Antar Jutla studied data showing air and ocean temperature, salinity, precipitation, and chlorophyll concentrations, the latter a marker for zooplankton. Anticipating where the bacteria will bloom helps researchers to mitigate outbreaks.
Recently, Colwell and Jutla have been able to estimate cholera risk after major events, such as severe storms, by looking at satellite precipitation data, air temperature, and population maps. The two maps above show the team’s predicted cholera risk in Haiti two weeks after Hurricane Matthew hit over October 1-2, 2016 and the actual reported cholera cases in October 2016.
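The general idea behind this kind of risk estimate can be sketched in a few lines. The function below is a hypothetical toy model, not Colwell and Jutla’s actual method: it combines three satellite-observable variables into a single risk score, and every threshold and weight in it is illustrative.

```python
# Toy cholera-risk score from satellite-observable variables.
# All thresholds and weights are illustrative, not the real model's.
def cholera_risk(precip_mm, air_temp_c, chlorophyll_mg_m3):
    """Combine three environmental variables into a 0-1 risk score."""
    score = 0.0
    if precip_mm > 100:            # heavy rain can contaminate water supplies
        score += 0.4
    if air_temp_c > 28:            # warm conditions favor Vibrio growth
        score += 0.3
    if chlorophyll_mg_m3 > 1.0:    # chlorophyll as a proxy for zooplankton
        score += 0.3
    return score
```

A real model would use gridded satellite products and population maps rather than single point values, but the principle — flagging places where several favorable conditions coincide — is the same.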
From helping preserve forests for chimpanzees to predicting deer population patterns, scientists use our satellites to study wildlife across the world. Satellites can also see the impacts of perhaps the most relatable animal to us: humans. Every day, we impact our planet in many ways including driving cars, constructing buildings and farming – all of which we can see with satellites.
Our Black Marble image provides a unique view of human activity. By looking at trends in nighttime lights, scientists can study how cities develop over time and how lighting and activity change during certain seasons and holidays; the data can even aid emergency responders during power outages caused by natural disasters.
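One simple way to use night-lights data for outage mapping is to compare each pixel’s radiance before and after an event. The sketch below is an illustrative toy in the spirit of Black Marble analyses; the 40% cutoff and the flat lists standing in for raster grids are assumptions.

```python
# Toy outage detector: flag pixels whose nighttime radiance dropped
# by more than a given fraction between two observations.
def outage_pixels(before, after, drop=0.4):
    """before/after: per-pixel radiance lists -> indices of likely outages."""
    return [i for i, (b, a) in enumerate(zip(before, after))
            if b > 0 and (b - a) / b > drop]
```

Real analyses must also correct for moonlight, clouds and viewing angle before a before/after comparison like this is meaningful.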
Scientists use our satellite data to study birds in a variety of ways, from understanding their migratory patterns, to spotting potential nests, to tracking populations. In a rather creative application, scientists used satellite imagery to track Antarctica’s emperor penguin populations by looking for their guano – or excrement.
Counting emperor penguins from the ground perspective is challenging because they breed in some of the most remote and cold places in the world, and in colonies too large to easily count manually. With their black and white coats, emperor penguins are also difficult to count from an aerial view as they sometimes blend in with shadows on the ice. Instead, Phil Trathan and his colleagues at the British Antarctic Survey looked through Landsat imagery for brown stains on the sea ice. By looking for penguin droppings, Trathan said his team identified 54 emperor penguin colonies along the Antarctic coast.
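The guano search boils down to a color classification problem: brown stains against white ice. The sketch below is a toy version of that idea; the thresholds are illustrative guesses, not the British Antarctic Survey’s actual values, and a tuple-grid stands in for real Landsat imagery.

```python
# Toy "brown stain" classifier for guano-like pixels against ice.
# Thresholds are illustrative, not the actual survey's values.
def looks_like_guano(r, g, b):
    # Brownish: red well above blue, moderate brightness, not white ice.
    return r > 90 and r > b + 30 and g < r and (r + g + b) < 500

def find_stains(image):
    """image: 2D grid of (r, g, b) tuples -> list of (row, col) hits."""
    return [(i, j)
            for i, row in enumerate(image)
            for j, px in enumerate(row)
            if looks_like_guano(*px)]
```

In practice the team worked with multispectral Landsat bands rather than plain RGB, but the principle of classifying stained pixels and then clustering them into colonies is the same.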
Just as we see plants grow and wilt on the ground, satellites observe the changes from space. Flourishing vegetation can indicate a lively ecosystem while changes in greenery can sometimes reveal natural disasters, droughts or even agricultural practices. While satellites can observe plant life in our backyards, scientists can also use them to provide a global picture.
Using data from satellites including SeaWiFS, and instruments including the NASA/NOAA Visible Infrared Imaging Radiometer Suite and the Moderate Resolution Imaging Spectroradiometer, scientists have the most complete view of global biology to date, covering all of the plant life on land and at the surface of the ocean.
Our satellites have helped scientists study creatures living in the oceans whether it’s finding suitable waters for oysters or protecting the endangered blue whale. Scientists also use the data to learn more about one of the most vulnerable ecosystems on the planet – coral reefs.
They may look like rocks or plants on the seafloor, but corals are very much living animals. Receiving sustenance from photosynthetic algae living within their tissues, coral reefs provide food and shelter for many kinds of marine life, protect shorelines from storms and waves, serve as a source for potential medicines, and operate as some of the most diverse ecosystems on the planet.
However, coral reefs are vulnerable to the warming of the ocean and human activity. Our satellites measure the surface temperature of ocean waters. These measurements have revealed rising water temperatures surrounding coral reef systems around the world, which causes a phenomenon known as “coral bleaching.” To add to the satellite data, scientists use measurements gathered by scuba divers as well as instruments flown on planes.
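Bleaching risk from satellite sea-surface temperatures is often summarized by accumulating heat stress over time. The sketch below is a simplified take on the "degree heating weeks" idea used by NOAA Coral Reef Watch: sum weekly temperature excesses of at least 1°C above a local bleaching threshold over a 12-week window. The numbers and the weekly (rather than twice-weekly) cadence are simplifying assumptions.

```python
# Simplified degree-heating-weeks accumulator (illustrative only).
def degree_heating_weeks(weekly_sst_c, bleaching_threshold_c):
    """Sum hotspot excesses (>= 1 degC) over the last 12 weeks, in degC-weeks."""
    recent = weekly_sst_c[-12:]
    hotspots = [t - bleaching_threshold_c for t in recent]
    return sum(h for h in hotspots if h >= 1.0)
```

Values around 4°C-weeks and above are commonly associated with significant bleaching risk in the operational product.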
During the week of Nov. 13-17, check out our stories and videos about how this view of life from space is furthering knowledge of our home planet and the search for life on other worlds. Follow at www.nasa.gov/Earth.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.
Verizon Cancels Elderly Woman’s Service on Her 84th Birthday http://ift.tt/2vePM37
Computational Fabrication research from the Interactive Geometry Lab can turn 3D model files into textile objects, connecting parts and forming shapes using zip fasteners:
Fabrication from developable parts is the basis for arts such as papercraft and needlework, as well as modern architecture and CAD in general, and it has inspired much research. We observe that the assembly of complex 3D shapes created by existing methods often requires first fabricating many small parts and then carefully following instructions to assemble them. Despite its significance, this error-prone and tedious process is generally neglected in the discussion. We present the concept of zippables – single, two-dimensional, branching, ribbon-like pieces of fabric that can be quickly zipped up without any instructions to form 3D objects. Our inspiration comes from the so-called zipit bags (just-zipit.com), which are made of a single, long ribbon with a zipper around its boundary. In order to assemble the bag, one simply needs to zip up the ribbon. Our method operates in the same fashion, but it can be used to approximate a wide variety of shapes. Given a 3D model, our algorithm produces plans for a single 2D shape that can be laser cut in a few parts from fabric or paper. A zipper can then be attached along the boundary by sewing, or by gluing using a custom-built fastening rig. We show physical and virtual results that demonstrate the capabilities of our method and the ease with which shapes can be assembled.
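The core geometric operation behind flattening methods like this is unrolling connected triangles into the plane while preserving their edge lengths. The sketch below is a minimal toy version of that step for a simple triangle strip — it is not the paper’s algorithm, which additionally handles branching, distortion control and zipper placement. The circle-intersection sign choice here is naive and can fold the strip onto itself.

```python
import math

def dist3(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def dist2(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def place(p0, p1, r0, r1):
    # Intersect circles of radius r0 about p0 and r1 about p1;
    # naively pick one of the two intersection points.
    d = dist2(p0, p1)
    a = (r0 ** 2 - r1 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r0 ** 2 - a ** 2, 0.0))
    mx = p0[0] + a * (p1[0] - p0[0]) / d
    my = p0[1] + a * (p1[1] - p0[1]) / d
    return (mx - h * (p1[1] - p0[1]) / d, my + h * (p1[0] - p0[0]) / d)

def unroll_strip(verts, tris):
    """Lay a strip of triangles flat, preserving each triangle's edge lengths.
    tris: vertex-index triples; consecutive triangles share an edge."""
    i, j, k = tris[0]
    flat = {i: (0.0, 0.0), j: (dist3(verts[i], verts[j]), 0.0)}
    flat[k] = place(flat[i], flat[j],
                    dist3(verts[i], verts[k]), dist3(verts[j], verts[k]))
    for tri in tris[1:]:
        new = [v for v in tri if v not in flat]
        if not new:
            continue
        n = new[0]
        a, b = [v for v in tri if v != n]
        flat[n] = place(flat[a], flat[b],
                        dist3(verts[a], verts[n]), dist3(verts[b], verts[n]))
    return flat
```

Because each new vertex is placed purely from distances to two already-flattened neighbors, every triangle in the 2D plan keeps its original edge lengths — the property that lets a flat cut-out zip up into the intended 3D surface.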
More Here
A robot communicates through an interface.
Saturn 3 (1980)
Latest project from @mario-klingemann employs neural networks trained on a collection of archival footage to recreate videos from the dataset.
It is confirmed that no human intervention has occurred in the processed output, and it is interesting to see where there are convincing connections between the two (and where there apparently are none):
Destiny Pictures, Alternative Neural Edit, Side by Side Version
This movie has been automatically collaged by a neural algorithm using the movie that Donald Trump gave as a present to Kim Jong Un as the template, replacing all scenes with visually similar scenes from public domain movies found in the Internet Archive.
Neural Remake of “Take On Me” by A-Ha
An AI automatically detects the scenes in the source video clip and then replaces them with similar looking archival footage. The process is fully automatic, there are no manual edits.
Neural Reinterpretation of “Sabotage” by the Beastie Boys
An AI automatically detects the scenes in the source video clip and then replaces them with similar looking archival footage.
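The pipeline described above — detect scene cuts, then swap each scene for its most visually similar archival clip — can be sketched very simply. This is not Klingemann’s actual code: frames and clips are represented here as plain feature vectors, and the distance metric and threshold are assumptions.

```python
# Toy scene-replacement pipeline: cut detection + nearest-neighbor match.
def detect_cuts(frames, threshold=0.5):
    """frames: list of feature vectors -> indices where a new scene starts."""
    cuts = [0]
    for i in range(1, len(frames)):
        diff = sum(abs(a - b) for a, b in zip(frames[i], frames[i - 1]))
        if diff > threshold:       # big frame-to-frame change = likely a cut
            cuts.append(i)
    return cuts

def nearest_archive(scene_feature, archive):
    """archive: {clip_name: feature vector} -> name of the closest clip."""
    return min(archive,
               key=lambda k: sum(abs(a - b)
                                 for a, b in zip(archive[k], scene_feature)))
```

A real system would extract features with a pretrained network and match whole shots rather than single frames, but the detect-then-substitute structure is the same.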
There are other video examples at Mario’s YouTube page (but some may not be viewable due to music copyright).
If you follow Mario’s Twitter timeline, you can stay up to date with the latest examples and follow the evolution of the project [link].
Micropayments might not top your list of most compelling inventions, but they’re a sought-after capability. Small payments of less than a dollar, or even less than a cent, have the potential to shake up old, established business models, and open up new doors for the Internet of Everything.
Small digital payments have been tried again and again — in fact, Web inventor Tim Berners-Lee tried to embed micropayment capability into the original World Wide Web, but without success. So far, inherent transaction costs have been an insurmountable hurdle.
Some argue that digital payment methods like bitcoin are the way forward.
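The transaction-cost hurdle is easy to see with a little arithmetic. The fee schedule below (30 cents plus 2.9%) is a typical card-processor rate used purely for illustration; the point is how the fee fraction explodes as the payment shrinks.

```python
# Fee as a fraction of the payment, for a fixed-plus-percentage fee.
# The 30-cent + 2.9% schedule is an illustrative processor rate.
def fee_fraction(payment, fixed_fee=0.30, percent_fee=0.029):
    return (fixed_fee + payment * percent_fee) / payment

# On a $20 purchase the fee is under 5% of the payment; on a 1-cent
# micropayment the fee is more than 30 times the payment itself.
```

Any micropayment scheme therefore has to drive the fixed per-transaction cost toward zero, which is exactly what cryptocurrency advocates claim their systems can do.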
Apple has just published an example for developers on how to use the front-facing camera on the iPhone X for AR apps:
This sample app presents a simple interface allowing you to choose between four augmented reality (AR) visualizations on devices with a TrueDepth front-facing camera (see iOS Device Compatibility Reference).
The camera view alone, without any AR content.
The face mesh provided by ARKit, with automatic estimation of the real-world directional lighting environment.
Virtual 3D content that appears to attach to (and be obscured by parts of) the user’s real face.
A simple robot character whose facial expression is animated to match that of the user.
Link
An intro video can be found here
Bitsquare, decentralised #bitcoin exchange
Holographic Cortana Appliance