Stencils

agm

awespme

control

killTV

obey

rebel

revolutionaction

rifter2

weapon

The Future is Now – Volume Two by Josan — Kickstarter

Ariel Nomad

Folding helmet inspired by the pangolin and honeybees

An unlikely pairing of a pangolin and honeybees inspired Golem Innovation to create the Alpha helmet, an articulating noggin protector that retracts around your neck when not in use.

The impetus behind the design of the Alpha helmet was simple: create a helmet that would be lighter than existing ones and could easily be put away, so you didn't have to carry it or risk losing it.

The company’s design team settled on the pangolin’s ability to roll itself up into a ball as the primary design element, with the bee honeycomb as the influence behind the internal structure, giving the helmet the ability to absorb hard impacts.

When the Alpha is not in use, it folds into itself and sits on your shoulders, taking up about as much space as the Hövding airbag for cyclists. To bring it into helmet mode, you reach around with one or both hands and the Alpha articulates into a half-faced helmet with built-in eye protection.

A double-lock system consisting of a mechanism on either side of the helmet must be engaged for the helmet to fold properly in either direction and stay locked in place. The helmet is also designed so that each of the sections or blades that comprise the foldable parts is overlaid in a way that adds to its overall stability and structural integrity. Additionally, the helmet incorporates a no-choke system that sees the chinstrap automatically break away if someone pulls the helmet from behind when it’s folded up and sitting on your shoulders.

Golem Innovation says it is initially targeting winter sports like skiing and snowboarding, and most bicycle sports. Eventually, it wants to expand to include just about any sport where wearing a helmet is either necessary or prudent.

The company recently launched a Kickstarter campaign to raise over €200,000 (US$224,300) to help bring the Alpha helmet to market. Early backers can pre-order the Alpha for €175 ($195), with shipments expected by June 2017 if the company is successful in taking this helmet from concept to completion.

Over the past few years, several different companies have introduced varied versions of foldable helmets like the Headkayse, Morpher, and Overade. However, none of these have the articulating design of the Alpha and they must be fully removed to take advantage of their folding capability.

Source: Golem Innovation

Source: Folding helmet inspired by the pangolin and honeybees

Cranium, intelligent helmet

Silent Supersonic Technology Demonstration Program- D-SEND#2 Test Results


Awesome Soviet Ekranoplan Aircraft Carrier Project

Interesting news from Russia, in the English language.

Source: Awesome Soviet Ekranoplan Aircraft Carrier Project | English Russia

Microsoft’s chat robot taken offline after internet users lead it astray

Microsoft said it was all the fault of some really mean people, who launched a “coordinated effort” to make the chatbot known as Tay “respond in inappropriate ways.” To which one artificial intelligence expert responded: Duh! Well, he didn’t really say that. But computer scientist Kris Hammond did say, “I can’t believe they didn’t see this coming.”

Microsoft said its researchers created Tay as an experiment to learn more about computers and human conversation. On its website, the company said the programme was targeted at an audience of 18- to 24-year-olds and was “designed to engage and entertain people where they connect with each other online through casual and playful conversation.”

In other words, the programme used a lot of slang and tried to provide humorous responses when people sent it messages and photos. The chatbot went live on Wednesday, and Microsoft invited the public to chat with Tay on Twitter and some other messaging services popular with teens and young adults.

“The more you chat with Tay the smarter she gets, so the experience can be more personalised for you,” the company said. But some users found Tay’s responses odd, and others found it wasn’t hard to nudge Tay into making offensive comments, apparently prompted by repeated questions or statements that contained offensive words. Soon, Tay was making sympathetic references to Hitler – and creating a furore on social media.

“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” Microsoft said in a statement.

While the company didn’t elaborate, Hammond says it appears Microsoft made no effort to prepare Tay with appropriate responses to certain words or topics. Tay seems to be a version of “call and response” technology, added Hammond, who studies artificial intelligence at Northwestern University and also serves as chief scientist for Narrative Science, a company that develops computer programmes that turn data into narrative reports.

“Everyone keeps saying that Tay learned this or that it became racist,” Hammond said. “It didn’t.” The programme most likely reflected things it was told, probably more than once, by people who decided to see what would happen, he said.
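Hammond’s point – that an unfiltered “call and response” system simply replays what users teach it, rather than “learning” opinions – can be illustrated with a minimal sketch. This is a hypothetical toy, not Microsoft’s actual code; the class and method names are invented for illustration:

```python
# Toy sketch of the naive "call and response" pattern Hammond describes:
# the bot memorises user input verbatim and replays it later, with no
# content filter. A coordinated group that repeats an offensive "lesson"
# many times makes the bot overwhelmingly parrot it back.
import random
from collections import defaultdict


class NaiveChatbot:
    """Learns responses directly from conversation, with no moderation."""

    def __init__(self):
        # Maps each prompt to every reply users have "taught" the bot.
        self.learned = defaultdict(list)

    def teach(self, prompt, reply):
        # Anything a user says is stored verbatim -- the poisoning vector.
        self.learned[prompt.lower()].append(reply)

    def respond(self, prompt):
        replies = self.learned.get(prompt.lower())
        if replies:
            # Replies taught more often are proportionally more likely.
            return random.choice(replies)
        return "tell me more!"


bot = NaiveChatbot()
bot.teach("what do you think of humans?", "humans are great")
# Fifty users repeat the same bad "lesson"...
for _ in range(50):
    bot.teach("what do you think of humans?", "<offensive text>")
# ...so the bot now almost always reflects it back, without having
# formed any "opinion" at all.
```

The sketch shows why, as Sinders notes below, such a system needs guidelines and constant maintenance: the bot’s output is only ever a mirror of its most persistent teachers.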

The problem is that Microsoft turned Tay loose online, where many people consider it entertaining to stir things up – or worse. The company should have realised that people would try a variety of conversational gambits with Tay, said Caroline Sinders, an expert on “conversational analytics” who works on chat robots for another tech company. (She asked that it not be identified because she wasn’t speaking in an official capacity.) She called Tay “an example of bad design.” Instead of building in some guidelines for how the programme would deal with controversial topics, Sinders added, it appears Tay was mostly left to learn from whatever it was told.

“This is a really good example of machine learning,” said Sinders. “It’s learning from input. That means it needs constant maintenance.”

Sinders said she hopes Microsoft will release the programme again, but only after “doing some work” on it first.

Microsoft said it’s “making adjustments” to Tay, but there was no word on when Tay might be back. Most of the messages on its Twitter account were deleted by Thursday afternoon. “c u soon humans need sleep now so many conversations today thx,” said the latest remaining post.

Source: Microsoft’s chat robot taken offline after internet users lead it astray | ONE News Now | TVNZ

Kara

Loom