This project by Anastasis Germanidis and Cristóbal Valenzuela is an online neural-network semantic painting tool. Trained on visual road data, it lets you doodle or place preset markers, which are then visually translated into another image:
Uncanny Road is an experimental tool for collectively synthesizing a never-ending road using Generative Adversarial Neural Networks. It is based on the pix2pixHD project, published by @nvidia and UC Berkeley (Project, Paper and Code), which allows for photorealistic image-to-image translation. The pix2pixHD model was trained using adversarial learning on the Cityscapes dataset, which contains thousands of street images.
To synthesize street images, draw on the colormap of the scene. Each color represents a different kind of object label (e.g. road, building, vegetation, etc.) that the neural network can understand.
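For a sense of how that works under the hood: the drawing canvas is effectively a label map, an image whose pixel colours encode object classes. The RGB triples below follow the published Cityscapes palette; the helper function is a sketch of my own, not code from Uncanny Road:

```python
# Minimal sketch: a semantic "colormap" is just an image whose RGB values
# encode class labels. These RGB triples are the standard Cityscapes palette;
# make_label_map is a hypothetical helper, not part of the actual tool.
CITYSCAPES_COLORS = {
    "road":       (128, 64, 128),
    "building":   (70, 70, 70),
    "vegetation": (107, 142, 35),
    "sky":        (70, 130, 180),
    "car":        (0, 0, 142),
}

def make_label_map(width, height, label):
    """Return a width x height grid of RGB tuples, all one class colour."""
    color = CITYSCAPES_COLORS[label]
    return [[color] * width for _ in range(height)]

# A tiny all-road canvas; the trained generator would translate a map like
# this into a photorealistic street image.
canvas = make_label_map(4, 2, "road")
```

Drawing a different colour onto the map is what tells the network "put a building here, vegetation there" before it hallucinates the photo.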
Try it out for yourself here
Visual experiment from LuluXXX transforms a music video using lllyasviel’s code designed to colorize black and white manga:
messing around with style2paint : https://github.com/lllyasviel/style2paints a lot of entropy going on. original video : https://www.youtube.com/watch?v=gGjJFRPtdDI Music performed by Twin Peaks cast member Chrysta Bell and David Lynch, written by David Lynch and Dean Hurley.
An online version of Style2paints can be found here
A solar eclipse occurs when the Moon temporarily blocks the light from the Sun. Within the narrow, 60- to 70-mile-wide band stretching from Oregon to South Carolina called the path of totality, the Moon completely blocked out the Sun’s face; elsewhere in North America, the Moon covered only a part of the star, leaving a crescent-shaped Sun visible in the sky.
During this exciting event, we were collecting your images and reactions online.
This composite image, made from 4 frames, shows the International Space Station, with a crew of six onboard, as it transits the Sun at roughly five miles per second during a partial solar eclipse from Northern Cascades National Park in Washington. Onboard as part of Expedition 52 are: NASA astronauts Peggy Whitson, Jack Fischer, and Randy Bresnik; Russian cosmonauts Fyodor Yurchikhin and Sergey Ryazanskiy; and ESA (European Space Agency) astronaut Paolo Nespoli.
Credit: NASA/Bill Ingalls
The Baily’s Beads effect is seen as the Moon makes its final move over the Sun during the total solar eclipse on Monday, August 21, 2017 above Madras, Oregon.
Credit: NASA/Aubrey Gemignani
This image from one of our Twitter followers shows the eclipse through tree leaves, cast as crescent-shaped shadows, from Seattle, WA.
Credit: Logan Johnson
“The eclipse in the palm of my hand”. The eclipse is seen here through an indirect method, known as a pinhole projector, by one of our followers on social media from Arlington, TX.
Credit: Mark Schnyder
Through the lens on a pair of solar filter glasses, a social media follower captures the partial eclipse from Norridgewock, ME.
Credit: Mikayla Chase
While most of us watched the eclipse from Earth, six humans had the opportunity to view the event from 250 miles above on the International Space Station. European Space Agency (ESA) astronaut Paolo Nespoli captured this image of the Moon’s shadow crossing America.
Credit: Paolo Nespoli
This composite image shows the progression of a partial solar eclipse over Ross Lake, in Northern Cascades National Park, Washington. The beautiful series of the partially eclipsed sun shows the full spectrum of the event.
Credit: NASA/Bill Ingalls
In this video captured at 1,500 frames per second with a high-speed camera, the International Space Station, with a crew of six onboard, is seen in silhouette as it transits the sun at roughly five miles per second during a partial solar eclipse, Monday, Aug. 21, 2017 near Banner, Wyoming.
Credit: NASA/Joel Kowsky
To see more images from our NASA photographers, visit: https://www.flickr.com/photos/nasahqphoto/albums/72157685363271303
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
Bitsquare, decentralised #bitcoin exchange
Had an interesting chat with an AI today
This ARKit proof-of-concept demo from Trixi Studios combines an augmented-reality portal with a ‘Take On Me’ music-video sketch-style filter, viewed through an iOS device camera:
Link
Researchers from the Carnegie Mellon Textiles Lab have put forward a framework that turns a 3D model file into a physical knitted object:
We present the first computational approach that can transform 3D meshes, created by traditional modeling programs, directly into instructions for a computer-controlled knitting machine. Knitting machines are able to robustly and repeatably form knitted 3D surfaces from yarn, but have many constraints on what they can fabricate. Given user-defined starting and ending points on an input mesh, our system incrementally builds a helix-free, quad-dominant mesh with uniform edge lengths, runs a tracing procedure over this mesh to generate a knitting path, and schedules the knitting instructions for this path in a way that is compatible with machine constraints. We demonstrate our approach on a wide range of 3D meshes.
More Here
Japan just sent the Int-Ball, a photo and video drone, to the International Space Station. Its mission is to document the astronauts, who previously spent 10% of their time doing photo and video documentation. Int-Ball’s footage can be seen in real time.
follow @the-future-now
VR Artist Anna Zhilyaeva shares her first creation made with Tiltbrush in 2018:
This is my first Tilt Brush painting of 2018. I tried to make her look good from every angle while keeping the painting style.
You can view Anna’s work on Google Poly here
Link
When I was a freshman, studying music, I built my first computer program… and I didn’t even know I was coding:
At the time, I was learning to analyze chords by identifying the individual notes, reordering them into “thirds”, and comparing this stack to the actual arrangement to determine the inversion. I didn’t know anything about programming at the time, but my roommate was an engineer who showed me Wolfram’s Mathematica, a coding environment useful in a number of fields.
Well, I was just as “screw the rules” then, so I learned just enough to build a sort of decision tree to do my chord analysis homework for me. Above, nested If[] statements determine the interval by calculating the distance between pitches (in half-steps). Below, a similar set-up figures out the inversion of a chord.
There are a bunch of similarities to the JavaScript world I generally live in these days. It looks like Mathematica uses [] brackets instead of () parentheses and {} squiggly brackets, and presents its arguments more like an Excel function, but all the math-y bits certainly work the same… except… I wish JavaScript let you string inequalities together like that!
One interesting peculiarity here - I have multiple functions with the same name. Whereas JavaScript functions don’t much care how many inputs you actually feed them, it seems I have different versions of the same keychordtype[] function for different numbers of inputs (defined here with a trailing _ underscore).
And instead of console.log() messages or alert() pop-ups, outputs are made visible with the MessageDialog[] function. So even though I don’t have any comments, and my nesting, naming, and order are a bit sloppy (look at those closing brackets! ridiculous!), I can still understand what’s going on - 10 years and several languages later.
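The nested-If decision tree translates naturally to other languages. As a rough sketch (in Python rather than Mathematica, with names of my own invention, not from the original notebook), the half-step interval arithmetic and the stacked-thirds chord classification might look like:

```python
# A rough Python sketch of the same idea: intervals are half-step distances,
# and a triad's quality falls out of the two stacked thirds.
PITCH_CLASSES = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                 "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def interval(low, high):
    """Half-step distance from low up to high, wrapped to one octave."""
    return (PITCH_CLASSES[high] - PITCH_CLASSES[low]) % 12

def triad_quality(root, third, fifth):
    """Classify a stacked-in-thirds triad by its two interior intervals."""
    lower = interval(root, third)
    upper = interval(third, fifth)
    # Python, like Mathematica, lets you chain comparisons: 3 <= lower <= 4.
    if lower == 4 and upper == 3:
        return "major"
    if lower == 3 and upper == 4:
        return "minor"
    if lower == 3 and upper == 3:
        return "diminished"
    if lower == 4 and upper == 4:
        return "augmented"
    return "not a tertian triad"

print(triad_quality("C", "E", "G"))  # major
```

Incidentally, Python is one language that does let you string inequalities together the way Mathematica does.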
tl;dr: music theory is math; different languages have different syntax, but logic is logic; Mathematica has a 2-week trial I’m eating through to take these screenshots
project: chord analysis homework helper
Built by the Standard Motor Company at the order of Lord Beaverbrook, the then Minister of Aircraft Production, the Car Armoured Light Standard, or ‘Beaverette’, was intended as a stopgap measure to help with airfield and local defence. Essentially built onto the chassis of Standard’s pre-war commercial models, it had thin riveted steel armour, backed by 3-inch oak reinforcement planks, to its front and sides; the MkII added all-around armour but remained open-topped.
The Beaverette was typically armed with either a Bren light machine gun or a Boys anti-tank rifle and was crewed by three men - driver, observer, gunner. Weighing two tonnes, the vehicle could reputedly reach up to 60 mph; however, given its weight, this seems optimistic. The weight of the armour was said to quickly fatigue the chassis and suspension.
Side view of some Beaverette MkIIs (source)
They were never cleared for foreign service and spent the war helping to train crews and patrolling the British Isles with regular army units (including the Reconnaissance Corps and the Royal Armoured Corps), the RAF Regiment and the Home Guard. Principally, the Beaverette was a stopgap measure, similar to the Armadillo, to supplement Universal Carriers when, following the evacuation of Dunkirk, the British Army was desperately in need of vehicles.
The Beaverette provided a propaganda boost, with many photographs and newsreels filmed of them on patrol; this helped to boost civilian morale and to show that Britain was not defenceless and that British ingenuity would help turn the tide. The later MkIII and MkIV Beaverettes added roofs and turrets but remained in limited defensive service.
Sources:
Images: 1 2 3 4 5
Standard Beaverette, Tank Encyclopedia, (source)
If you enjoy the content please consider supporting Historical Firearms through Patreon!