K-2SO will be there for you in augmented reality. Visit www.starwars.com/k2andme to find out how.
IBM is expanding its blockchain work across a growing number of locations and employees, and Marie Wieck ties it all together. from CoinDesk http://ift.tt/2xbXrkC
Project from Fernando Ramallo is a drawing and animation tool for Unity with a simple interface for creating assets for games and interactive experiences, a bit like Flash but in 2.5D:
DOODLE STUDIO 95 is a FUN drawing and animation tool for Unity.
Doodle an animation without leaving the Editor and turn your drawings into sprites, UI elements, particles or textures, with a single click.
Draw inside the Unity Editor
Easy presets for backgrounds, characters and UI elements
Example scenes with 2.5D characters, foliage, speech bubbles and transitions, with reusable scripts
Draw and animate inside the Scene View (beta)
Shadow-casting shaders
Don’t think about materials or image formats, it Just Works.
Five Symmetry modes
Record mode adds frames as you draw
Record a sound with a single click! Boop!
Easy API for using animations with scripts
Convert to sprite sheets or GIFs
…and more
You can find out more here, and even try out a browser-based interactive tour here.
This house is being 3D printed through combined human and robot construction. Mesh Mould technology uses the precision of robotic building to eliminate waste.
follow @the-future-now
Did you know the guy from Elysium really played Chappie?
Sunflowers or Gears Turning Stimboard for anon
Sources: (x) (x) (x) (x) (x) (x) (x) (x) (x)
Project from Universal Everything is a series of films exploring human-machine collaboration, here presenting performative dance with human and abstracted forms:
Hype Cycle is a series of futurist films exploring human-machine collaboration through performance and emerging technologies.
Machine Learning is the second set of films in the Hype Cycle series. It builds on the studio’s past experiments with motion studies, and asks: when will machines achieve human agility?
Set in a spacious, well-worn dance studio, a dancer teaches a series of robots how to move. As the robots’ abilities develop from shaky mimicry to composed mastery, a physical dialogue emerges between man and machine – mimicking, balancing, challenging, competing, outmanoeuvring.
Can the robot keep up with the dancer? At what point does the robot outperform the dancer? Would a robot ever perform just for pleasure? Does giving a machine a name give it a soul?
These human-machine interactions from Universal Everything are inspired by the Hype Cycle trend graphs produced by Gartner Research, a valiant attempt to predict future expectations and disillusionments as new technologies come to market.
More Here
Coding experiment from Kyle McDonald that arranges samples for music production in a unique way using machine learning:
I’ve been thinking about new ways of making music and working with sound. I’m especially excited about machine learning augmenting our selection of sounds, analyzing and decomposing existing recordings, and making automatic suggestions for compositions.
This shows around 30k “drum samples” from a few different sample packs, organized in 2d (position) and 3d (color). All sounds are less than 4 seconds long, but I only analyze and play the first second while scrolling through. I used librosa to extract the constant-q transform of each sound with 84 bins and 11 time steps. I used t-SNE with perplexity 100 to lay out the sounds from those 924-dimensional vectors.
Link
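As a rough sketch of the pipeline described above: the snippet below extracts an 84-bin constant-Q transform per sound with librosa and lays the resulting 924-dimensional vectors out with t-SNE. The sample folder, sample rate, and frame padding are assumptions for illustration, not Kyle McDonald's actual code.

```python
# Minimal sketch, assuming a folder of short WAV samples; the 84 CQT bins,
# 11 time steps (84 * 11 = 924 dimensions) and t-SNE perplexity of 100 come
# from the post, everything else (paths, sample rate, padding) is assumed.
import glob
import numpy as np
import librosa
from sklearn.manifold import TSNE

SR = 22050      # resample every file to a common rate
N_BINS = 84     # constant-Q bins
N_FRAMES = 11   # time steps kept per sound

features = []
for path in sorted(glob.glob("samples/*.wav")):   # hypothetical sample folder
    # analyze only the first second of each sound, as described in the post
    y, _ = librosa.load(path, sr=SR, duration=1.0)
    cqt = np.abs(librosa.cqt(y, sr=SR, n_bins=N_BINS))
    # pad or truncate to a fixed frame count so every vector is 924-dimensional
    if cqt.shape[1] < N_FRAMES:
        cqt = np.pad(cqt, ((0, 0), (0, N_FRAMES - cqt.shape[1])))
    features.append(cqt[:, :N_FRAMES].flatten())

X = np.asarray(features)

# 2D layout gives each sound a position; a separate 3D embedding could supply color
positions = TSNE(n_components=2, perplexity=100).fit_transform(X)
```

With the 2D coordinates, each sample can be drawn as a point and, as the post describes, the first second of the nearest sound played back while scrolling through the layout.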
🌿 by javi.eats.and.runs on insta
1995 video of Virtual IO's I-Glasses, a virtual reality head-mounted display.