An ongoing attempt to learn new technologies through simple, playful, and (sometimes) interactive sketches ♥ 

[_ here you will find vanilla JS, p5.js, CSS animations, TouchDesigner, Unity, SparkAR, machine learning, and more _]
 

As part of the development and user research of Marrow, its first technical director, Cristobal Valenzuela (in collaboration with RunwayML), developed a public interface to play with the text-to-image concept.
We were thrilled to see the excitement this demo caused.
It seemed to give users an immediate route to creative expression with a machine, which inspired interesting tests.

︎ Marrow is an interactive theater experience about the possibility of mental states in the intelligent machines we create. More about the research
︎ Launch demo
Date: August 2018
Technology: AttnGAN.
Featured: Motherboard, Gizmodo; also tweeted by Ian Goodfellow (the father of GANs)
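
Not the demo's actual code, but the interaction pattern is simple: take the user's sentence, send it to a hosted text-to-image model, and show the returned image. A minimal sketch in plain JavaScript, where MODEL_URL, the page elements, and the request/response shape are hypothetical stand-ins:

```javascript
// Minimal sketch of a text-to-image playground (not the original demo code).
// MODEL_URL and the JSON shapes are assumptions standing in for whatever
// hosted AttnGAN endpoint the real interface talked to.
const MODEL_URL = "https://example.com/attngan/query"; // hypothetical

async function generateImage(prompt) {
  const res = await fetch(MODEL_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ caption: prompt }),
  });
  const { image } = await res.json();          // assumed: base64-encoded PNG
  const img = document.createElement("img");
  img.src = `data:image/png;base64,${image}`;
  document.body.appendChild(img);
}

// Assumes an <input id="prompt"> on the page: type a sentence, get an image back.
document.querySelector("#prompt").addEventListener("change", (e) => {
  generateImage(e.target.value);
});
```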

In this web experiment, I used a generative design approach to create "Coronas" and implemented a genetic algorithm from scratch in p5.js (inspired by a coding challenge by Daniel Shiffman).
The Coronas evolve to find the "best" path around the "Capitalism" obstacle.


︎ Launch project
︎ View source code
Date: April 2020.
Technology: p5.js, JavaScript.
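
The real source is linked above; this is only a minimal p5.js sketch of the same idea, with made-up constants: each Corona carries a genome of steering forces, dies if it touches the obstacle, and the fittest genomes are crossed over and mutated into the next generation.

```javascript
// Minimal p5.js sketch of the idea (not the original code): a population of
// "Coronas" evolves a genome of steering forces to reach a target while
// avoiding a rectangular "Capitalism" obstacle.
const LIFESPAN = 200;   // steps per generation
const POP_SIZE = 50;
const target = { x: 200, y: 30 };
const obstacle = { x: 100, y: 150, w: 200, h: 20 };
let population, step = 0;

function randomGenes() {
  return Array.from({ length: LIFESPAN }, () => p5.Vector.random2D().mult(0.3));
}

class Corona {
  constructor(genes) {
    this.pos = createVector(200, 370);
    this.vel = createVector();
    this.genes = genes || randomGenes();
    this.dead = false;
  }
  update(i) {
    if (this.dead) return;
    this.vel.add(this.genes[i]);
    this.pos.add(this.vel);
    // touching the obstacle "kills" this corona
    if (this.pos.x > obstacle.x && this.pos.x < obstacle.x + obstacle.w &&
        this.pos.y > obstacle.y && this.pos.y < obstacle.y + obstacle.h) {
      this.dead = true;
    }
  }
  fitness() {
    const d = dist(this.pos.x, this.pos.y, target.x, target.y);
    return (this.dead ? 0.1 : 1) / (d + 1); // closer is better, crashing is penalized
  }
  show() {
    fill(this.dead ? 'red' : 'white');
    circle(this.pos.x, this.pos.y, 8);
  }
}

function nextGeneration() {
  // fitness-proportional selection, single-point crossover, small mutation
  const pool = [];
  for (const c of population) {
    const tickets = 1 + floor(c.fitness() * 5000);
    for (let i = 0; i < tickets; i++) pool.push(c);
  }
  return Array.from({ length: POP_SIZE }, () => {
    const a = random(pool).genes, b = random(pool).genes;
    const cut = floor(random(LIFESPAN));
    const genes = a.slice(0, cut).concat(b.slice(cut)).map(g =>
      random() < 0.01 ? p5.Vector.random2D().mult(0.3) : g);
    return new Corona(genes);
  });
}

function setup() {
  createCanvas(400, 400);
  population = Array.from({ length: POP_SIZE }, () => new Corona());
}

function draw() {
  background(20);
  fill('yellow');
  rect(obstacle.x, obstacle.y, obstacle.w, obstacle.h); // the "Capitalism" obstacle
  circle(target.x, target.y, 16);                       // the goal
  for (const c of population) { c.update(step); c.show(); }
  if (++step === LIFESPAN) { population = nextGeneration(); step = 0; }
}
```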





I’m playing with SparkAR for Instagram;
here is a filter I created during the Coronavirus days.

Follow me on Instagram to use this one yourself!

Date: April 2020.
Technology: SparkAR


An experiment I did as part of Raycaster that combines a machine learning network (AttnGAN and the COCO dataset) with the concept of religion to generate images from quotes in the Bible,
such as:
“Then God said, Let there be light; and there was light.”

or:
“It is not good that the man should be alone; I will make him a helper as his partner.”

︎ Experiment #hashtag
Date: July 2018
Developed together with Ziv Schneider and Eyal Gruss.
Technology: text to image (AttnGAN model).
Featured: Prosthetic Knowledge


I remember seeing a website made in p5 with an "attraction" feeling to it, so I set myself to recreate it. Once you click, it switches the colors and red dots “follow” you. This one is part of my experiments in web front-end design.
︎ Launch project
︎ View source code
Date: May 2020.
Technology: p5.js, JavaScript.
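
The real source is linked above; roughly, the mechanic looks like this in p5.js (a simplified stand-in, not the original):

```javascript
// Rough p5.js sketch of the idea: a grid of red dots that, once you click,
// ease toward the cursor while the background color flips.
let dots = [];
let attracting = false;

function setup() {
  createCanvas(400, 400);
  noStroke();
  for (let x = 20; x < width; x += 40) {
    for (let y = 20; y < height; y += 40) {
      dots.push(createVector(x, y));
    }
  }
}

function mousePressed() {
  attracting = !attracting; // clicking toggles the "follow" behaviour
}

function draw() {
  background(attracting ? 0 : 255); // colors switch on click
  fill(255, 0, 0);
  for (const d of dots) {
    if (attracting) {
      // ease each dot a little toward the mouse every frame
      d.x = lerp(d.x, mouseX, 0.05);
      d.y = lerp(d.y, mouseY, 0.05);
    }
    circle(d.x, d.y, 8);
  }
}
```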

On Mental Health Day in May 2020, I created this experiment with a space colonization algorithm that slowly takes over the “thoughts” in space and swallows them until they disappear entirely
(I still need to upload the code; for now, here is some documentation).

Date: May 2020.
Technology: p5.js, JavaScript.
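
Since the code isn’t uploaded yet, here is only a rough p5.js sketch of the space colonization idea (my stand-in, not the project’s code): the “thoughts” act as attractor points, a branching structure grows toward them, and each thought disappears once a branch reaches it.

```javascript
// Rough p5.js sketch of space colonization: scattered "thoughts" attract a
// growing branch structure and are swallowed when a branch gets close enough.
const ATTRACT_DIST = 100; // how far a thought can pull a branch
const KILL_DIST = 8;      // how close a branch must get to swallow a thought
const STEP = 4;           // growth per frame
let thoughts = [];        // attractor points
let nodes = [];           // branch nodes: { pos, parent }

function setup() {
  createCanvas(400, 400);
  for (let i = 0; i < 200; i++) {
    thoughts.push(createVector(random(width), random(height)));
  }
  nodes.push({ pos: createVector(width / 2, height), parent: null }); // root
}

function draw() {
  background(255);

  // each thought pulls on its nearest branch node within reach
  const pulls = new Map(); // node index -> accumulated pull direction
  for (const t of thoughts) {
    let best = -1, bestD = ATTRACT_DIST;
    for (let i = 0; i < nodes.length; i++) {
      const d = p5.Vector.dist(t, nodes[i].pos);
      if (d < bestD) { bestD = d; best = i; }
    }
    if (best >= 0) {
      const dir = p5.Vector.sub(t, nodes[best].pos).normalize();
      pulls.set(best, (pulls.get(best) || createVector()).add(dir));
    }
  }

  // grow a new node from every pulled node, in the averaged direction
  for (const [i, dir] of pulls) {
    const pos = p5.Vector.add(nodes[i].pos, dir.normalize().mult(STEP));
    nodes.push({ pos, parent: nodes[i] });
  }

  // swallow thoughts that a branch has reached
  thoughts = thoughts.filter(t =>
    nodes.every(n => p5.Vector.dist(t, n.pos) > KILL_DIST));

  // draw remaining thoughts and the growing structure
  stroke(0); fill(0);
  for (const t of thoughts) circle(t.x, t.y, 3);
  for (const n of nodes) {
    if (n.parent) line(n.pos.x, n.pos.y, n.parent.pos.x, n.parent.pos.y);
  }
}
```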



Learning to create kinetic typography based on different 3D geometries
Date: March 2020
Technology: TouchDesigner

An experiment in which I built an audio-visual instrument using the Giphy and text-to-speech APIs.
︎ Launch website
︎ GitHub link

Date: February 2019
Platform: Web
Technology: React, Giphy API, Text to Speech.
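
A stripped-down sketch of the core loop (not the instrument’s actual code, which is on GitHub above); YOUR_GIPHY_KEY is a placeholder for your own API key:

```javascript
// Speak a word aloud with the Web Speech API and show a matching GIF from Giphy.
const GIPHY_KEY = "YOUR_GIPHY_KEY"; // placeholder: use your own key

async function playWord(word) {
  // 1. speak the word (browser text-to-speech)
  speechSynthesis.speak(new SpeechSynthesisUtterance(word));

  // 2. fetch a GIF for the same word from the Giphy search endpoint
  const url = `https://api.giphy.com/v1/gifs/search?api_key=${GIPHY_KEY}` +
              `&q=${encodeURIComponent(word)}&limit=1`;
  const res = await fetch(url);
  const { data } = await res.json();
  if (data.length === 0) return;

  // 3. display it
  const img = document.createElement("img");
  img.src = data[0].images.original.url;
  document.body.appendChild(img);
}

// e.g. trigger words from key presses to "play" the instrument
document.addEventListener("keydown", (e) => {
  if (e.key === "g") playWord("glitch");
});
```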


Two different examples of reactive sound in TouchDesigner


Learning some abstract particle displacement using feedback, compositing, blur, and displace in TouchDesigner.

Date: August 2019



Learning TOP concepts in TouchDesigner
Date: May 2019
 

In times when our means of communication have changed and affected the way we connect with each other, I wanted to create this mobile web app that matches users for immediate human eye contact. This is a demo app I built to experiment with the idea, currently meant for two users at a time. So far I’ve only run it in “user tests” :)

Date: April 2019
Platform: Web mobile
Technology: React on Rails, Google Maps API, WebSocket, Action Cable.

︎ GitHub link
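
Not the app’s code (that’s on GitHub above), but a sketch of the real-time half, where “MatchChannel” and the message shapes are hypothetical stand-ins for whatever the Rails backend exposes over Action Cable:

```javascript
// Sketch of the real-time pairing only, not the actual app code.
import { createConsumer } from "@rails/actioncable";

const consumer = createConsumer(); // connects to /cable by default

function startEyeContactSession(partnerId) {
  // app-specific: open the camera view and start the shared session
  console.log("Eye contact session with", partnerId);
}

consumer.subscriptions.create("MatchChannel", {
  connected() {
    // announce availability; the server pairs two waiting users
    this.perform("request_match", {});
  },
  received(data) {
    // assumed payload shape: { type: "matched", partnerId: "..." }
    if (data.type === "matched") startEyeContactSession(data.partnerId);
  },
});
```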



draw-guess is a game I built for two players, where one player gets a random word to draw in 30 seconds, and the second player has to guess the word based on the drawing.

Date: January 2019
Platform: Web
Technology: JavaScript, HTML, HTML canvas, CSS.

︎ GitHub link
︎ Launch website
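
The repo above has the full game; as a bare sketch of the drawing side, assuming a plain canvas element with id "board" and a 30-second round:

```javascript
// Bare sketch of the drawing half of draw-guess (not the original code):
// freehand drawing on an HTML canvas, locked after a 30-second round.
const canvas = document.querySelector("#board"); // assumes <canvas id="board">
const ctx = canvas.getContext("2d");
let drawing = false;
let roundOver = false;

canvas.addEventListener("mousedown", (e) => {
  if (roundOver) return;
  drawing = true;
  ctx.beginPath();
  ctx.moveTo(e.offsetX, e.offsetY);
});

canvas.addEventListener("mousemove", (e) => {
  if (!drawing || roundOver) return;
  ctx.lineTo(e.offsetX, e.offsetY);
  ctx.stroke();
});

canvas.addEventListener("mouseup", () => (drawing = false));

// the drawer gets 30 seconds, then the canvas is handed to the guesser
setTimeout(() => {
  roundOver = true;
  console.log("Time's up! Send the drawing to the guessing player.");
}, 30 * 1000);
```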



Getting into generative art in Unity - generating fractals
Date:  October 2019
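
The project itself was built in Unity (C#); as a stand-in illustration of the same recursive idea, here is the classic fractal tree sketched in p5.js:

```javascript
// Stand-in illustration only: a recursive fractal tree in p5.js,
// not the Unity project's code.
function setup() {
  createCanvas(400, 400);
  stroke(255);
}

function draw() {
  background(0);
  translate(width / 2, height);                    // trunk starts at bottom center
  const angle = map(mouseX, 0, width, 0, PI / 2);  // mouse controls the spread
  branch(100, angle);
}

function branch(len, angle) {
  line(0, 0, 0, -len);
  translate(0, -len);
  if (len > 4) {
    push();
    rotate(angle);
    branch(len * 0.67, angle);  // right sub-branch
    pop();
    push();
    rotate(-angle);
    branch(len * 0.67, angle);  // left sub-branch
    pop();
  }
}
```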


︎   ︎