Another experiment from LuluXXX exploring machine learning visual outputs, combining black-and-white image colourization, SLIC superpixel image segmentation, and style2paint manga colourization on footage of Aya-Bambi dancing:
mixing slic superpixels and style2paint. starts with version 2 then version 4, plus a little breakdown at the end and all 4 versions of the algorithm.
music: Spettro from NadjaLind mix - https://soundcloud.com/nadjalind/nadja-lind-pres-new-lucidflow-and-sofa-sessions-meowsic-mix
original footage: https://www.youtube.com/watch?v=N2YhzBjiYUg
Link
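For anyone curious about the superpixel stage: it is reproducible with off-the-shelf tools. Below is a minimal sketch using scikit-image's SLIC implementation; the colourization and style2paint passes are separate neural models not shown here, and the filename is just a placeholder.

```python
# SLIC superpixel segmentation plus mean-colour flattening, the stage
# that gives footage that flat, cel-like mosaic look.
import matplotlib.pyplot as plt
from skimage import io
from skimage.color import label2rgb
from skimage.segmentation import mark_boundaries, slic

frame = io.imread("frame.png")[..., :3]  # placeholder frame; drop alpha if present

# Cluster pixels into ~400 perceptually uniform regions.
# compactness trades colour similarity against spatial regularity.
segments = slic(frame, n_segments=400, compactness=10, start_label=1)

# Flatten each superpixel to its mean colour.
flattened = label2rgb(segments, frame, kind="avg", bg_label=0)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 5))
ax1.imshow(mark_boundaries(frame, segments))
ax1.set_title("SLIC boundaries")
ax2.imshow(flattened)
ax2.set_title("mean-colour superpixels")
plt.show()
```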
Readings from «Scientific American» (ca. 1950): Computers and Computation, With Introductions by Robert R. Fenichel and Joseph Weizenbaum, W. H. Freeman and Company, San Francisco, 1971
Augmented reality sandbox - move the sand around and it shows off the topography and sea/water level, projected right back onto the surface.
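How it works, roughly: a depth camera reads the sand height and a projector paints elevation colours back onto it. A rough sketch of that height-to-colour mapping, with smoothed random noise standing in for the depth feed:

```python
# Minimal sketch of the sandbox's colour mapping, assuming a depth
# camera (e.g. a Kinect) supplies a height map of the sand. Anything
# below a chosen sea level is drawn as flat water; the rest gets a
# terrain colormap. The heights here are synthetic stand-ins.
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
heights = gaussian_filter(rng.normal(10, 4, size=(120, 160)), sigma=6)

SEA_LEVEL = 10.0  # cm; anything lower is flooded

# Normalise land heights into [0, 1] for the colormap.
land = np.clip((heights - SEA_LEVEL) / (heights.max() - SEA_LEVEL), 0, 1)
rgb = plt.cm.terrain(0.25 + 0.75 * land)[..., :3]  # skip the colormap's own blues
rgb[heights < SEA_LEVEL] = (0.1, 0.3, 0.8)         # flat water colour

plt.imshow(rgb)
plt.title("projected topography (sketch)")
plt.show()
```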
This dartboard from former NASA engineer Mark Rober guarantees a bull’s-eye through the power of software, motors, and motion tracking.
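The rough idea: motion-capture cameras track the dart in flight, software extrapolates where it will cross the board's plane, and motors slide the bull's-eye there before impact. A hedged sketch of just the prediction step - the numbers are illustrative, not from Rober's actual build:

```python
# Projectile prediction: given one tracked position/velocity sample,
# find where the dart crosses the board plane under gravity alone.
import numpy as np

G = 9.81  # m/s^2

def predict_impact(p, v, board_x):
    """p = (x, y, z) position in metres, v = velocity in m/s, with x
    toward the board and z up. Returns the (y, z) impact point on the
    plane x = board_x, assuming gravity is the only force."""
    t = (board_x - p[0]) / v[0]            # time until the board plane
    y = p[1] + v[1] * t                    # straight-line sideways drift
    z = p[2] + v[2] * t - 0.5 * G * t**2   # parabolic drop
    return y, z

# Example: dart tracked 2.4 m from the board, flying at 6 m/s.
y, z = predict_impact(np.array([0.0, 0.1, 1.6]),
                      np.array([6.0, -0.05, 0.5]), board_x=2.4)
print(f"move bull's-eye to y={y:.3f} m, z={z:.3f} m")
```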
Artists Can Büyükberber and Marpi explore iOS ARKit for potential AR Art projects:
Weekend AR Experiment in collaboration with Marpi, an invisible exhibition blurring the lines between what’s real and what is not. Recorded in real-time at the de Young Museum, San Francisco, CA. Running realtime from a custom app, using iPad or iPhone with the new iOS ARKit and Unity.
“This is not like playing the two at the same time. The machine and its software aren’t layering the sounds of a clavichord atop those of a Hammond. They’re producing entirely new sounds using the mathematical characteristics of the notes that emerge from the two.”
https://soundcloud.com/wired/nsynth-bass-flute
https://soundcloud.com/wired/nsynth-organ-bass
https://soundcloud.com/wired/nsynth-flute-organ
Read More: https://www.wired.com/2017/05/google-uses-ai-create-1000s-new-musical-instruments/
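The key point is that the blending happens in a learned representation of the sound, not in the waveform (where mixing would just layer the two notes). As a very crude analogue of that idea - emphatically not Magenta's API - here magnitude spectrograms plus Griffin-Lim resynthesis stand in for the WaveNet autoencoder's embeddings:

```python
# Morph two tones by interpolating in a feature space and resynthesizing,
# rather than summing waveforms. Magnitude spectrograms are a crude
# stand-in for NSynth's learned embeddings; the tones are placeholders.
import numpy as np
import librosa

SR = 16000  # sample rate for the toy signals

def encode(audio):
    # stand-in "embedding": magnitude spectrogram
    return np.abs(librosa.stft(audio))

def decode(magnitude):
    # Griffin-Lim estimates a phase for the interpolated magnitudes,
    # producing a new signal rather than a sum of the inputs.
    return librosa.griffinlim(magnitude)

def morph(a, b, mix=0.5):
    return decode((1 - mix) * encode(a) + mix * encode(b))

t = np.arange(SR) / SR
tone_a = np.sin(2 * np.pi * 440 * t)           # placeholder "flute"
tone_b = np.sign(np.sin(2 * np.pi * 220 * t))  # placeholder "organ"
hybrid = morph(tone_a, tone_b, mix=0.5)        # halfway between the two
```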
On Aug. 21, 2017, a solar eclipse will be visible in North America. Throughout the continent, the Moon will cover part – or all – of the Sun’s super-bright face for part of the day.
Since it’s never safe to look at the partially eclipsed or uneclipsed Sun, everyone who plans to watch the eclipse needs a way to do so safely. One of the easiest ways to watch an eclipse is with solar viewing glasses – but there are a few things to check to make sure your glasses are safe:
Glasses should have an ISO 12312-2 certification
They should also have the manufacturer’s name and address, and you can check if the manufacturer has been verified by the American Astronomical Society
Make sure they have no scratches or damage
To use solar viewing glasses, make sure you put them on before looking up at the Sun, and look away before you remove them. Proper solar viewing glasses are extremely dark, and the landscape around you will be totally black when you put them on – all you should see is the Sun (and maybe some types of extremely bright lights if you have them nearby).
Never use solar viewing glasses while looking through a telescope, binoculars, a camera viewfinder, or any other optical device. The concentrated solar rays will damage the filter and enter your eyes, causing serious injury. But you can use solar viewing glasses on top of your regular eyeglasses, if you wear them!
If you don’t have solar viewing glasses, there are still ways to watch, like making your own pinhole projector. You can make a handheld box projector with just a few simple supplies – or simply hold any object with a small hole (like a piece of cardstock with a pinhole, or even a colander) above a piece of paper on the ground to project tiny images of the Sun.
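A handy rule of thumb for the projector: the Sun’s angular diameter is about half a degree, so the projected image is roughly 1/108th of the hole-to-paper distance across. A quick check:

```python
# Size of the projected Sun for a given pinhole-to-paper distance.
import math

SUN_ANGULAR_DIAMETER_DEG = 0.53  # the Sun's apparent size from Earth

def solar_image_size(distance_m):
    """Diameter (m) of the projected solar image at a given distance."""
    return distance_m * math.tan(math.radians(SUN_ANGULAR_DIAMETER_DEG))

for d in (0.3, 1.0, 2.0):  # cardstock held 30 cm, 1 m, 2 m above paper
    print(f"{d:>4} m -> image about {solar_image_size(d) * 1000:.1f} mm wide")
```

So a colander held a metre above the pavement throws Sun images about 9 mm across.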
Of course, you can also watch the entire eclipse online with us. Tune into nasa.gov/eclipselive starting at noon ET on Aug. 21!
For people in the path of totality, there will be a few brief moments when it is safe to look directly at the eclipse. Only once the Moon has completely covered the Sun and there is no light shining through is it safe to look at the eclipse. Make sure you put your eclipse glasses back on or return to indirect viewing before the first flash of sunlight appears around the Moon’s edge.
You can look up the length of totality in your area ahead of time so you know exactly how long it’s safe to look. Remember – this only applies to people within the path of totality.
Everyone else will need to use eclipse glasses or indirect viewing throughout the entire eclipse!
Whether you’re an amateur photographer or a selfie master, try out these tips for photographing the eclipse.
#1 — Safety first: Make sure you have the required solar filter to protect your camera.
#2 — Any camera is a good camera, whether it’s a high-end DSLR or a camera phone – a good eye and vision for the image you want to create is most important.
#3 — Look up, down, and all around. As the Moon slips in front of the Sun, the landscape will be bathed in long shadows, creating eerie lighting. Light filtering through the overlapping leaves of trees, which act as natural pinholes, will also project mini eclipse replicas on the ground. Everywhere you can point your camera can yield exceptional imagery, so be sure to compose some wide-angle photos that can capture your eclipse experience.
#4 — Practice: Be sure you know the capabilities of your camera before Eclipse Day. Most cameras, and even many camera phones, have adjustable exposures, which can help you darken or lighten your image during the tricky eclipse lighting. Make sure you know how to manually focus the camera for crisp shots.
#5 — Upload your eclipse images to NASA’s Eclipse Flickr Gallery and relive the eclipse through other people’s images.
Learn all about the Aug. 21 eclipse at eclipse2017.nasa.gov, and follow @NASASun on Twitter and NASA Sun Science on Facebook for more. Watch the eclipse through the eyes of NASA at nasa.gov/eclipselive starting at 12 PM ET on Aug. 21.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
I cannot argue with that
Project from Google Creative Lab is an open source physical interface for their NSynth project, which generates new sounds by using machine learning to understand the characteristics of existing ones:
Building upon past research in this field, Magenta created NSynth (Neural Synthesizer). It’s a machine learning algorithm that uses a deep neural network to learn the characteristics of sounds, and then create a completely new sound based on these characteristics.
Rather than combining or blending the sounds, NSynth synthesizes an entirely new sound using the acoustic qualities of the original sounds—so you could get a sound that’s part flute and part sitar all at once.
Since the release of NSynth, Magenta has continued to experiment with different musical interfaces and tools to make the output of the NSynth algorithm more easily accessible and playable.
Using NSynth Super, musicians have the ability to explore more than 100,000 sounds generated with the NSynth algorithm.
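The instrument’s touch surface maps finger position to a blend of the source sounds assigned around it. A plausible sketch of that corner weighting (the embeddings below are random placeholders, not real NSynth data):

```python
# Bilinear weighting of four corner sounds from a touch position -
# a sketch of the interface idea, not the device's actual firmware.
import numpy as np

def corner_weights(x, y):
    """x, y in [0, 1]: touch position on the pad. Returns weights for
    the four corners (bottom-left, bottom-right, top-left, top-right);
    the weights always sum to 1."""
    return np.array([(1 - x) * (1 - y),  # bottom-left
                     x * (1 - y),        # bottom-right
                     (1 - x) * y,        # top-left
                     x * y])             # top-right

# Hypothetical precomputed embeddings for four source instruments.
corners = np.random.default_rng(1).normal(size=(4, 128))

touch = (0.7, 0.2)                        # mostly bottom-right
blend = corner_weights(*touch) @ corners  # interpolated (128,) embedding
print(corner_weights(*touch))             # [0.24, 0.56, 0.06, 0.14]
```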
More Here
Did you know the guy from Elysium, Sharlto Copley, really played Chappie on set?