HOVER BONES
Plus check out Glitch Black’s music on Bandcamp!
This release from IBM Research is a computer vision dataset for a real-time gesture recognition system, notable for its minimal representation visualizations:
This dataset was used to build the real-time gesture recognition system described in the CVPR 2017 paper titled “A Low Power, Fully Event-Based Gesture Recognition System.” The data was recorded using a DVS128. The dataset contains 11 hand gestures from 29 subjects under 3 illumination conditions and is released under a Creative Commons Attribution 4.0 license.
More Here
EDIT - Here is a brief video explanation:
https://github.com/yahoo/samoa
Machine learning and data mining are well-established techniques in the world of IT, especially among web companies and startups. Spam detection, personalization, and recommendations are just a few of the applications made possible by mining the huge quantity of data available nowadays. However, “big data” is not only about Volume, but also about Velocity (and Variety: the three Vs of big data).
The usual pipeline for modeling data (what “data scientists” do) involves taking a sample from production data, cleaning and preprocessing it to make it usable, training a model for the task at hand and finally deploying it to production. The final output of this process is a pipeline that needs to run periodically (and be maintained) in order to keep the model up to date. Hadoop and its ecosystem (e.g., Mahout) have proven to be an extremely successful platform to support this process at web scale.
However, no solution is perfect, and big data is “data whose characteristics force us to look beyond the traditional methods that are prevalent at the time”. The current challenge is to move towards analyzing data as soon as it arrives in the system, nearly in real time.
For example, models for mail spam detection get outdated with time and need to be retrained with new data. New data (i.e., spam reports) comes in continuously, and the model starts being outdated the moment it is deployed: all the new data sits without creating any value until the next model update. Instead, incorporating new data as soon as it arrives is what the “Velocity” in big data is about. In this case, Hadoop is not the ideal tool to cope with streams of fast-changing data.
Distributed stream processing engines are emerging as the platform of choice to handle this use case. Examples of these platforms are Storm, S4, and recently Samza. These platforms join the scalability of distributed processing with the fast response of stream processing. Yahoo has already adopted Storm as a key technology for low-latency big data processing.
Alas, currently there is no common solution for mining big data streams, that is, for doing machine learning on streams in a distributed environment.
SAMOA (Scalable Advanced Massive Online Analysis) is a framework for mining big data streams. Like most of the big data ecosystem, it is written in Java. It features a pluggable architecture that allows it to run on several distributed stream processing engines such as Storm and S4. SAMOA includes distributed algorithms for the most common machine learning tasks, such as classification and clustering. For a simple analogy, you can think of SAMOA as Mahout for streaming.
SAMOA is both a platform and a library. As a platform, it allows algorithm developers to abstract away from the underlying execution engine and therefore reuse their code to run on different engines. It also makes it easy to write plug-in modules that port SAMOA to new execution engines.
As a library, SAMOA contains state-of-the-art implementations of algorithms for distributed machine learning on streams. The first alpha release supports classification and clustering.
For classification, we implemented a Vertical Hoeffding Tree (VHT), a distributed streaming version of decision trees tailored for sparse data (e.g., text). For clustering, we included a distributed algorithm based on CluStream. The library also includes meta-algorithms such as bagging.
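Hoeffding trees such as the VHT owe their name to the Hoeffding bound, which tells the learner how many instances it must observe before it can choose a split attribute with high confidence. The sketch below illustrates only the formula; it is not SAMOA's implementation.

```java
// Minimal sketch of the Hoeffding bound behind Hoeffding-tree learners.
// With probability 1 - delta, the true mean of a random variable with
// range R differs from its sample mean over n observations by at most
//   epsilon = sqrt(R^2 * ln(1/delta) / (2n)).
public class HoeffdingBound {
    static double bound(double range, double delta, long n) {
        return Math.sqrt(range * range * Math.log(1.0 / delta) / (2.0 * n));
    }

    public static void main(String[] args) {
        // The bound shrinks as instances arrive: once the gap between the
        // two best split attributes exceeds epsilon, the split is safe.
        System.out.println(bound(1.0, 1e-7, 1_000));     // wide: few instances
        System.out.println(bound(1.0, 1e-7, 1_000_000)); // narrow: many instances
    }
}
```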
An algorithm in SAMOA is represented by a series of nodes communicating via messages along streams that connect pairs of nodes (a graph). Borrowing the terminology from Storm, this is called a Topology. Each node in the Topology is a Processor that sends messages to a Stream. The user code that implements the algorithm resides inside a Processor. Figure 3 shows an example of a Processor joining two streams from two source Processors. Here is a code snippet to build such a topology in SAMOA.
TopologyBuilder builder;
Processor sourceOne = new SourceProcessor();
builder.addProcessor(sourceOne);
Stream streamOne = builder.createStream(sourceOne);
Processor sourceTwo = new SourceProcessor();
builder.addProcessor(sourceTwo);
Stream streamTwo = builder.createStream(sourceTwo);
Processor join = new JoinProcessor();
builder.addProcessor(join).connectInputShuffle(streamOne).connectInputKey(streamTwo);
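The final call in the snippet above connects the join Processor's two inputs with different groupings. As a rough illustration of what those mean (the helper class below is hypothetical, not part of SAMOA's API): shuffle grouping spreads messages round-robin across a Processor's parallel replicas, while key grouping hashes a message key so that all messages sharing a key reach the same replica.

```java
// Toy illustration of shuffle vs. key grouping for routing a message to
// one of numReplicas parallel copies of a Processor. Not SAMOA's API.
public class Groupings {
    // Shuffle grouping: balance load by cycling through replicas.
    static int shuffle(int messageCounter, int numReplicas) {
        return messageCounter % numReplicas;
    }

    // Key grouping: hash the key so the same key always lands on the
    // same replica (needed when a replica keeps per-key state, as in a join).
    static int byKey(String key, int numReplicas) {
        return Math.floorMod(key.hashCode(), numReplicas);
    }

    public static void main(String[] args) {
        int replicas = 4;
        for (int i = 0; i < 8; i++)
            System.out.print(shuffle(i, replicas) + " "); // 0 1 2 3 0 1 2 3
        System.out.println();
        // Deterministic routing per key:
        System.out.println(byKey("user-42", replicas) == byKey("user-42", replicas));
    }
}
```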
1. Download SAMOA
git clone git@github.com:yahoo/samoa.git
cd samoa
mvn -Pstorm package
2. Download the Forest CoverType dataset.
wget "http://downloads.sourceforge.net/project/moa-datastream/Datasets/Classification/covtypeNorm.arff.zip"
unzip covtypeNorm.arff.zip
Forest CoverType contains the forest cover type for 30 x 30 meter cells obtained from US Forest Service (USFS) Region 2 Resource Information System (RIS) data. It contains 581,012 instances and 54 attributes, and it has been used in several papers on data stream classification.
3. Download a simple logging library.
wget "http://repo1.maven.org/maven2/org/slf4j/slf4j-simple/1.7.2/slf4j-simple-1.7.2.jar"
4. Run an example: classify the CoverType dataset with the VerticalHoeffdingTree in local mode.
java -cp slf4j-simple-1.7.2.jar:target/SAMOA-Storm-0.0.1.jar com.yahoo.labs.samoa.DoTask "PrequentialEvaluation -l classifiers.trees.VerticalHoeffdingTree -s (ArffFileStream -f covtypeNorm.arff) -f 100000"
The output will be a sequence of the evaluation metrics for accuracy, taken every 100,000 instances.
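The PrequentialEvaluation task uses prequential (interleaved test-then-train) evaluation: each arriving instance first tests the current model and is then used to train it, so accuracy reflects performance on unseen data at every point in the stream. A minimal sketch, using a toy majority-class learner in place of the VHT:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of prequential (interleaved test-then-train) evaluation. The
// "learner" is a toy majority-class classifier, not the VHT; only the
// evaluation scheme itself is the point here.
public class Prequential {
    static double run(int[] labels) {
        Map<Integer, Integer> counts = new HashMap<>();
        int majority = -1, correct = 0;
        for (int label : labels) {
            if (majority == label) correct++;       // 1. test on the instance...
            counts.merge(label, 1, Integer::sum);   // 2. ...then train on it
            if (majority == -1 || counts.get(label) > counts.get(majority))
                majority = label;
        }
        return (double) correct / labels.length;    // stream accuracy so far
    }

    public static void main(String[] args) {
        System.out.println(run(new int[]{1, 1, 0, 1, 1, 1}));
    }
}
```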
To run the example on Storm, please refer to the instructions on the wiki.
For more information about SAMOA, see the README and the wiki on github, or post a question on the mailing list.
SAMOA is licensed under an Apache Software License v2.0. You are welcome to contribute to the project! SAMOA accepts contributions under an Apache style contributor license agreement.
Good luck! We hope you find SAMOA useful. We will continue developing the framework by adding new algorithms and platforms.
Gianmarco De Francisci Morales (gdfm@yahoo-inc.com) and Albert Bifet (abifet@yahoo.com) @ Yahoo Labs Barcelona
Augmented Reality app from Nexus Studios offers a geolocation wayfinding service with a virtual guide in the form of a half-naked gentleman:
HotStepper is your first Augmented Reality sidekick to any destination on Earth. HotStepper features a confident dude who, when he’s not dancing, will walk you to any location you need to go. All you need to do is go outside, pick a destination on the map and then just follow him as he does his thing.
More Here
Finding your friends at a festival | by David Urbina for @neonapp. Get notified when the app is released. Music: Seven Lions x Illenium x Said The Sky.
the age of the really useful apps is starting
— Nil (@niluspc)
August 16, 2017
This is rad. Hope it shows up at some festivals soon https://t.co/c9a1W7auEe
— Goldroom (@goldroom)
August 16, 2017
One of the best uses for AR I’ve seen. https://t.co/kxGAUzVyEf
— Alexander Danling (@baobame)
August 15, 2017
Seeing more practical and indispensable use-cases for AR than I have for new apps in quite a while. pic.twitter.com/zwHEGkYZrK via @ARKitweekly
— Scott Belsky (@scottbelsky)
August 15, 2017
Reasons like this are why I think AR >> VR https://t.co/7rt5pRT3o6
— Mohammad Al Azzouni (@mazzouni)
August 17, 2017
I need this in my life! https://t.co/yGbGrWYLBD
— Stefan Goodchild ⚛ (@stefangoodchild)
August 15, 2017
ARKit really will bring a new wave of useful functionality to the phone. https://t.co/H6TT1SlFkj
— CM Harrington (@octothorpe)
August 15, 2017
I love this. Good example of AR solving a REAL problem 👏 https://t.co/6wx3RSwSag
— Sam Clarke (@sclarke111)
August 17, 2017
ARKit is going to empower so many awesome apps when iOS 11 ships. https://t.co/MUaTqbDUb1
— Matt Sayward (@mattsayward)
August 15, 2017
By far the most functional implementation of AR I’ve ever seen. https://t.co/cWC3ymxq9z
— Thomas Claessens (@DeClaessens)
August 16, 2017
This looks mighty useful https://t.co/vh3vTjuVLO
— Max Böck (@mxbck)
August 15, 2017
Impressive (and actually useful) https://t.co/VHdlXzAdGY
— Dominik Schmidt (@sluderndotcom)
August 16, 2017
This is such a good idea! https://t.co/X7xhgB7xeT
— Donna Lowe (@reloweeda)
August 15, 2017
👍🏽 would be super handy https://t.co/9Tk2Q16qnE
— Simon (@liquidmedia2013)
August 15, 2017
Genuinely useful AR coming to a field near you. https://t.co/4M8b92UJLk
— Cennydd (@Cennydd)
August 16, 2017
Find you festival friends with AR - Definitely the coolest implementation I’ve seen so far. App revolution 2.0 on its way. https://t.co/dKDkPRbMw1
— Tom Austin (@tomhaustin)
August 15, 2017
I can’t wait to try his app 😱 https://t.co/YHkZ9F91Zn
— Alexandre Mouriec (@mrcalexandre)
August 15, 2017
This is magical. ARKit demos by the app developers have been 👌🏻. Can’t wait to play with these apps. https://t.co/0RmQ7kkCiE
— KietChieng (@KietChieng)
August 16, 2017
GIMME THAT GIMME THAT RIGHT NOW https://t.co/Hg6fO6GWOq
— Valentin (@valdecarpentrie)
August 15, 2017
This is something I need https://t.co/iVjEkRxCaJ
— Andrew Rodebaugh (@andrewrodebaugh)
August 15, 2017
Less lost folks wandering the festival grounds aimlessly… Love some functional AR! https://t.co/deXJ8nMFQu
— Kent Weber (@WeberKent)
August 15, 2017
Again. This will be a game changer https://t.co/YiN2LQvmU5
— Jens@Gamescom (@JensHerforth)
August 15, 2017
This is a pretty cool use of GPS+ARKit, awesome demo use case! 🛳-it! #ARKit #MapKit #iOS11 https://t.co/LmMjPfo7KW
— Benjamin Hendricks (@benjhendricks)
August 15, 2017
The practical uses of #AR are incredible… this kind of thing will be the norm in the next few years & I can’t wait to test it. #Innovation https://t.co/XdkAdEG11G
— Josh Worth (@JoshWorthh)
August 15, 2017
OMG best use of the #ARKit. At festivals, i spend half my time looking for my friends in the crowd… https://t.co/YPb0AfAFjn
— Julie Tonna (@julie_tonna)
August 15, 2017
Awesome! This would also be cool for something like @ingress / @PokemonGoApp. Ps: love that new iPhone design 😉
— Marcel (@marceldk)
August 15, 2017
OMG !!!!!!!! #Devslopes https://t.co/MzN5RKn1DI
— leonyuon (@leonyuonl)
August 15, 2017
This is amazing! https://t.co/ZrpQBEgaU3
— Shane Griffiths (@shanegriffiths)
August 15, 2017
i just cant stop getting excited by these ARKit demos 🌟 https://t.co/IXAM6N0VBf
— nikhil srinivasan 👾 (@nvs)
August 15, 2017
Just think how much more enjoyable festivals would have been if you weren’t constantly losing/looking for everyone. https://t.co/uzxNJMqI4c
— Neil Cooper (@ncooperdesign)
August 15, 2017
Future killer Jazz Fest/Mardi Gras app for iPhone. (and really every other large gathering where you wanna find your friends) https://t.co/RXkVrLOuQB
— Stephen Sullivan (@swgs)
August 15, 2017
💯 arkit is legit 💯 https://t.co/8h3gWtdMtE
— Sean PJPGR Doran (@spjpgrd)
August 15, 2017
Another cool use of #ARKit https://t.co/0QUrN4BgJF
— Matt Zarandi ⚡️ (@MattZarandi)
August 15, 2017
Now this is something genuinely useful for AR https://t.co/7CvykUc2SQ
— Joel (@joevo2)
August 16, 2017
#musthave https://t.co/4KIhkWghKD
— Gee 🔥 (@Georg_Schmo)
August 15, 2017
This would have come in so handy on many occasions. https://t.co/2jI7uQn1Lf
— Steven Lin (@Stevenchlin)
August 15, 2017
Another great usecase! https://t.co/T5ggr8Qyez
— Schlabbeschambes (@DerHurly)
August 15, 2017
AR is gonna be so cool https://t.co/qmlxshUk03
— Beans (@beano629)
August 15, 2017
This is pretty brilliant! https://t.co/TevMmjBLKE
— Vlad Vukicevic (@vvuk)
August 15, 2017
A 🔥use case here ⬇️ just amazing #ARKit https://t.co/elPyWbW4iO
— Glenville Morris (@glenvillemorris)
August 15, 2017
Now thats a smart techcombi https://t.co/wH8ECU7VxO
— thefirstfloor (@jeroenduhmooij)
August 15, 2017
We gonna be livin’ in 2025 real soon. https://t.co/RgXCAjdb2t
— David Bird (@David_Burns_Red)
August 15, 2017
here’s another super rad use case that would also work for finding your Lyft / Uber driver https://t.co/JVm3oqGrW9
— TIFFANY ZHONG (@TZhongg)
August 15, 2017
Great usage of ARKit! https://t.co/jJ1VDOX4zb
— Elliot Turner (@eturner303)
August 15, 2017
#ARKit (demo) with a practical concept to navigate space and impact social engagement #AR #interactivetech #socialAR https://t.co/2352xf9haz
— Melody Koebler (@melabyyte)
August 15, 2017
Well, that’s bloody awesome https://t.co/XvCLwNsqJB
— Neil Kleiner (@nkleiner)
August 15, 2017
Handy real-world application for #AR. Beats “we’re to the left of the stage” https://t.co/zoMbK4dUSm
— Jon Williams (@yesthatjon)
August 15, 2017
Now THIS is awesome › https://t.co/xP6LamQuua #ARKit
— Jermaine (@dviate)
August 15, 2017
Neat idea. Is it just me or does it feel like it wants a giant column of light like in an MMO or something? https://t.co/SM2dKw80wT
— Gabe Weiss (@GabeWeiss_)
August 15, 2017
Yes and yes! And not just for finding people you already know, opt-in real-time people discovery in the offline world has massive potential https://t.co/zsAQy0q55z
— Shuvi👩🏻💻 (@shuvi)
August 15, 2017
Find my friends on a whole new level #ARKit https://t.co/l53rkXr4PS
— Spencer Bratman (@SpencerBratman)
August 15, 2017
Eyyy this is what I’m talkin about—next to disrupt social media? https://t.co/eN2BSvYXNh
— Kenneth Ng (@KennethLNg)
August 16, 2017
Well this is awesomely handy. https://t.co/KmU4FJvErV
— Dan Z (@danactual)
August 16, 2017
Stop this is amazing!! https://t.co/ZcTy1iAlVt
— Daniel Feodoroff (@mrdanielfeo)
August 16, 2017
Clever! https://t.co/SnjqQD8gL9
— geoff brown (@cgeoffreybrown)
August 16, 2017
Looking forward to way more of this … https://t.co/Qdx0fMK3sh
— Neil Voss (@neilvoss)
August 16, 2017
Just watch this video, one of the best uses of AR I’ve seen https://t.co/OZFjwiIKLP
— Ben King (@kngbn79)
August 16, 2017
AR tinder is gonna be wicked
— Utkarsh Gupta (@u7karsh)
August 16, 2017
Now this is cool! #arkit #ar #AugmentedReality https://t.co/s7E4jkqkpN
— Jen Abel 💬💫 (@jjen_abel)
August 17, 2017
i’ve been waiting for an app like this for a while https://t.co/0uaEwKgtm9
— ✨🌵🦊 🌴✨ (@ryanrogalski)
August 17, 2017
So, our physics teacher has the strange idea of motivating his students by letting each of us present a physical phenomenon we find interesting to our classmates in a five-minute presentation. And now I need something that is interesting for everyone - even people who usually don't care for physics - but has interesting facts for someone who's interested in it, too (preferably with an easy experiment). You don't happen to have any ideas, do you?
First of all, your professor is awesome for taking the time to do this. Off the top of my head, the best one I have is Chladni figures.
Basically take a flat metal plate, fix it at the center and spray some fine sand particles on it.
Using a violin bow, gently excite any edge of the plate to magically witness these beautiful normal mode patterns (known as Chladni patterns/figures) forming on the plate.
Also notice that by pinching the plate at different points, the pattern obtained changes.
There is a whole lot of physics behind such a simple phenomenon, and I dare not say we understand it completely. There are lots of questions about these figures that we have no answer for!
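To give a feel for where the patterns come from, here is a toy model (not a physical simulation): the sand collects on the nodal lines, where the standing wave's amplitude is zero, and for a square plate a classic approximation superposes two membrane modes (m, n). The program prints '#' wherever that superposition is near zero.

```java
// Toy Chladni-pattern model for a unit square plate. Sand gathers where
// the vibration amplitude is (near) zero, so we mark those points with '#'.
// Real free-plate modes are more complicated; this is only an illustration.
public class Chladni {
    // Classic superposition of the (m, n) and (n, m) modes.
    static double amplitude(int m, int n, double x, double y) {
        return Math.cos(m * Math.PI * x) * Math.cos(n * Math.PI * y)
             - Math.cos(n * Math.PI * x) * Math.cos(m * Math.PI * y);
    }

    public static void main(String[] args) {
        int size = 30;
        for (int i = 0; i < size; i++) {
            StringBuilder row = new StringBuilder();
            for (int j = 0; j < size; j++) {
                double x = (double) j / (size - 1), y = (double) i / (size - 1);
                // '#' marks a nodal point (sand), '.' a vibrating region.
                row.append(Math.abs(amplitude(3, 1, x, y)) < 0.15 ? '#' : '.');
            }
            System.out.println(row);
        }
    }
}
```

Changing (3, 1) to other mode pairs produces different nodal patterns, just as bowing the plate at different spots selects different modes.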
Hope this helps with your presentation. Have a good one!
Gif source video: Steve Mould
Design project by Leslie Nooteboom is a lamp that can project artificial natural lighting onto walls, created with high-rise apartment spaces in mind:
komorebi is sunlight filtering through leaves, creating a dance of light and shadows where filtered sunrays hit a surface. It is the reflections on pavements underneath centuries-old trees on a sunny day, and moving, framed lightboxes through windows of homes onto walls. However, these days buildings are taller than they have ever been, creating a place to live for as many people as possible on the tiniest piece of land possible. Homes become a place of isolation from the outside – windows are absent or so tiny that even the idea of nature disappears, and lighting has become so artificial that there is no sense of day, time or place anymore.
komorebi lets you curate natural lighting experiences indoors.
In a time when indoor sunlight is becoming more scarce, the need for technological nature is increasing. With an ever-growing global population and rapidly rising urbanisation levels, fewer living spaces are able to receive direct sunlight. There are attempts at solving this issue; however, these are very static. Intensity and colour seem to be the only ways in which their light is dynamic.
You can find out more at Creative Applications here or the project page here
Today, we’re celebrating the Red Planet! Since our first close-up picture of Mars in 1965, spacecraft voyages to the Red Planet have revealed a world strangely familiar, yet different enough to challenge our perceptions of what makes a planet work.
You’d think Mars would be easier to understand. Like Earth, Mars has polar ice caps and clouds in its atmosphere, seasonal weather patterns, volcanoes, canyons and other recognizable features. However, conditions on Mars vary wildly from what we know on our own planet.
Viking Landers
Our Viking Project found a place in history when it became the first U.S. mission to land a spacecraft safely on the surface of Mars and return images of the surface. Two identical spacecraft, each consisting of a lander and an orbiter, were built. Each orbiter-lander pair flew together and entered Mars orbit; the landers then separated and descended to the planet’s surface.
Besides taking photographs and collecting other science data, the two landers conducted three biology experiments designed to look for possible signs of life.
Pathfinder Rover
In 1997, Pathfinder was the first-ever robotic rover to land on the surface of Mars. It was designed as a technology demonstration of a new way to deliver an instrumented lander to the surface of a planet. Mars Pathfinder used an innovative method of directly entering the Martian atmosphere, assisted by a parachute to slow its descent and a giant system of airbags to cushion the impact.
Pathfinder not only accomplished its goal but also returned an unprecedented amount of data and outlived its primary design life.
Spirit and Opportunity
In January 2004, two robotic geologists named Spirit and Opportunity landed on opposite sides of the Red Planet. With far greater mobility than the 1997 Mars Pathfinder rover, these robotic explorers have trekked for miles across the Martian surface, conducting field geology and making atmospheric observations. Carrying identical, sophisticated sets of science instruments, both rovers have found evidence of ancient Martian environments where intermittently wet and habitable conditions existed.
Both missions exceeded their planned 90-day mission lifetimes by many years. Spirit lasted 20 times longer than its original design until its final communication to Earth on March 22, 2010. Opportunity continues to operate more than a decade after launch.
Mars Reconnaissance Orbiter
Our Mars Reconnaissance Orbiter left Earth in 2005 on a search for evidence that water persisted on the surface of Mars for a long period of time. While other Mars missions have shown that water flowed across the surface in Mars’ history, it remained a mystery whether water was ever around long enough to provide a habitat for life.
In addition to using the orbiter to study Mars, we’re using data and imagery from this mission to survey possible future human landing sites on the Red Planet.
Curiosity
The Curiosity rover is the largest and most capable rover ever sent to Mars. It launched November 26, 2011 and landed on Mars on Aug. 5, 2012. Curiosity set out to answer the question: Did Mars ever have the right environmental conditions to support small life forms called microbes?
Early in its mission, Curiosity’s scientific tools found chemical and mineral evidence of past habitable environments on Mars. It continues to explore the rock record from a time when Mars could have been home to microbial life.
Space Launch System Rocket
We’re currently building the world’s most powerful rocket, the Space Launch System (SLS). When completed, this rocket will enable astronauts to begin their journey to explore destinations far into the solar system, including Mars.
Orion Spacecraft
The Orion spacecraft will sit atop the Space Launch System rocket as it launches humans deeper into space than ever before. Orion will serve as the exploration vehicle that will carry the crew to space, provide emergency abort capability, sustain the crew during the space travel and provide safe re-entry from deep space return velocities.
Mars 2020
The Mars 2020 rover mission takes the next step in exploration of the Red Planet by not only seeking signs of habitable conditions in the ancient past, but also searching for signs of past microbial life itself.
The Mars 2020 rover introduces a drill that can collect core samples of the most promising rocks and soils and set them aside in a “cache” on the surface of Mars. The mission will also test a method for producing oxygen from the Martian atmosphere, identify other resources (such as subsurface water), improve landing techniques and characterize weather, dust and other potential environmental conditions that could affect future astronauts living and working on the Red Planet.
For decades, we’ve sent orbiters, landers and rovers, dramatically increasing our knowledge about the Red Planet and paving the way for future human explorers. Mars is the next tangible frontier for human exploration, and it’s an achievable goal. There are challenges to pioneering Mars, but we know they are solvable.
To discover more about Mars exploration, visit: https://www.nasa.gov/topics/journeytomars/index.html
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
Installation by David Bowen reproduces realtime wind data with a collection of mechanized stalks:
This installation consists of a series of 126 x/y tilting mechanical devices connected to thin dried plant stalks installed in a gallery, and a dried plant stalk connected to an accelerometer installed outdoors. When the wind blows, it causes the stalk outside to sway. The accelerometer detects this movement, transmitting the motion to the grouping of devices in the gallery. The stalks in the gallery space therefore move in real time and in unison based on the movement of the wind outside.
From May to September 2018, a newly expanded version of tele-present wind was installed at Azkuna Zentroa, Bilbao, and the sensor was installed in an outdoor location adjacent to the Visualization and Digital Imaging Lab at the University of Minnesota. Thus the individual components of the installation in Spain moved in unison as they mimicked the direction and intensity of the wind halfway around the world. As it monitored and collected real-time data from this remote and distant location, the system relayed a physical representation of the dynamic and fluid environmental conditions.
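The data flow Bowen describes can be sketched roughly as follows; the helper names, the tilt range, and the accelerometer scaling are all assumptions for illustration, not the artist's actual code.

```java
// Hypothetical sketch of the installation's data flow: one outdoor
// accelerometer sample is broadcast to all 126 indoor x/y tilt devices,
// which is what makes the gallery stalks move in unison.
public class TelePresentWind {
    static final double MAX_TILT_DEG = 15.0; // assumed mechanical limit

    // Map a raw accelerometer reading (in g, expected range [-1, 1]) to a
    // tilt angle in degrees, clamping out-of-range gusts.
    static double toTilt(double accel) {
        double clamped = Math.max(-1.0, Math.min(1.0, accel));
        return clamped * MAX_TILT_DEG;
    }

    public static void main(String[] args) {
        double ax = 0.4, ay = -0.2;  // one sample from the outdoor stalk
        double tiltX = toTilt(ax), tiltY = toTilt(ay);
        for (int device = 0; device < 126; device++) {
            // send(device, tiltX, tiltY);  // hardware I/O omitted
        }
        System.out.println(tiltX + ", " + tiltY);
    }
}
```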
More Here
Related: Another project by David from 2012 did something similar with ‘Tele-Present Water’ [Link]