> what started as me meeting a stranger-soon-friend at the airport w/ @ThomasStubblefield and asking him why he was wearing fake prescription glasses to make himself look smarter turned into the most fire hackathon project ever.
this weekend, @ArnavChauhan , @TCYTseven , and i worked together to think of the most useless project we could build in ~24 hours with the ray-ban meta ai glasses (yes, those glasses).
introducing (brace yourselves, and give me credit for the name) our final project, "POKER? I HARDLY KNOW HER!" ♦️ ♣️
you might've come across this everyday situation many times in your life: you have homework assignments due tomorrow but you ALSO want to play poker.
just wear our glasses and 1) have them analyze your hand, the cards on the table, and the emotions of the poker faces in the room around you to 2) calculate a probability of you winning the game and then 3) answer the next question on your homework, but with an accuracy rate inversely proportional to your odds of winning the poker match. using computer vision, opencv, roboflow, openai, elevenlabs, & open-source python libraries we've made this a reality for you. watch our <https://youtu.be/JVtFxCJw5ng|demo> to get the full experience!
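for the curious, the core gimmick is only a few lines of logic once the vision calls are done. here's a minimal sketch, where estimate_win_probability and answer_homework are hypothetical stand-ins for our roboflow/openai calls (the names, heuristic, and numbers are made up for illustration, not copied from the repo):
```python
# rough sketch of the core loop; the helpers below are placeholders for the
# actual card-detection and LLM calls, not the real project code.
import random

def estimate_win_probability(hand, board, opponent_emotions):
    # placeholder: real version used roboflow card detection plus a simple
    # equity heuristic, nudged down for each confident-looking opponent
    base = 0.5
    penalty = 0.05 * sum(1 for e in opponent_emotions if e == "confident")
    return max(0.0, min(1.0, base - penalty))

def homework_accuracy(win_probability):
    # the gimmick: the better your poker odds, the worse your homework answer
    return 1.0 - win_probability

def answer_homework(question, accuracy):
    # placeholder for the openai call; with probability `accuracy` we return
    # the good answer, otherwise a deliberately wrong one
    good = f"(correct-ish answer to: {question})"
    bad = f"(confidently wrong answer to: {question})"
    return good if random.random() < accuracy else bad

if __name__ == "__main__":
    p_win = estimate_win_probability(["Ah", "Kd"], ["Qs", "7c", "2h"], ["confident", "nervous"])
    print("win probability:", p_win)
    print(answer_homework("what is 7 x 8?", homework_accuracy(p_win)))
```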
scrappy moment: turns out meta provides no easy way to access the camera feed from the glasses for analysis. however (!!), they do allow access to the camera feed in their own services (yes, we ended up making facebook accounts for this initially 💀). the live video used in this project is taken from a whatsapp video call between two devices + obs video capture.
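if you want to pull off the same trick, the capture side is just opencv pointed at the obs virtual camera. a rough sketch of that step (the device index is machine-specific and this reflects our assumed setup, not code lifted from the repo):
```python
# read the OBS virtual camera feed frame-by-frame with opencv.
# device index 1 is an assumption -- 0 is usually the built-in webcam.
import cv2

cap = cv2.VideoCapture(1)
if not cap.isOpened():
    raise RuntimeError("could not open the OBS virtual camera -- check the device index")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # frames get handed off to card/emotion detection from here
    cv2.imshow("glasses feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```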
github repo: github.com/sahitid/meta-vision-project
🕶 📼 full demo: youtu.be/JVtFxCJw5ng
*tl;dr:* we made a meta glasses-powered system that lets you play poker and do homework at the same time, but the better you do at one, the worse you perform at the other.
*super special thanks* to @ShubhamPatil @kevinjosethomas @sarthak @Atulya-U04FJLBJ72S @JesseCogburn (jimbooo) @Mohamad @Rhys-U04GECG3H8W for the hours of moral support + starring in our demo + teaching us about your love for horticulture + & of course "chris"
@magicfrog: this is the first PCB i've ever made! introducing... ✨ magic frog's lightsaber ✨!! i designed it originally for #C06RQ9TTEG3|... but it took a lot longer than expected lol. it has an mpu6050 gyroscope & accelerometer and an arduino nano. when you move it faster, more of the neopixel LEDs light up, and each axis (x, y, z) is assigned to red, blue, or green.
i first designed the entire board not planning on using the arduino nano or mpu6050, instead using components built into the board, but there were a tonnn of problems (one of which was burning the bootloader, but then again i got to use an oscilloscope to see the data lines which was sooo cool) so i decided to change plans. the board has some fun lighthouse-themed silkscreens bc i was first planning on 3d-printing a case and making a lighthouse desk lamp.
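the real firmware runs as arduino C on the nano, but for anyone curious, the accelerometer-to-LED mapping boils down to something like this python illustration (the LED count, scaling, and function name are made up for the example, not taken from the board's code):
```python
# illustration only: map acceleration (in g) to neopixel colors.
# faster motion lights more LEDs; each axis drives one color channel.
NUM_LEDS = 30  # hypothetical strip length

def lightsaber_frame(ax, ay, az, max_g=2.0):
    """Return a list of (r, g, b) values for the strip from one accel reading."""
    magnitude = (ax**2 + ay**2 + az**2) ** 0.5
    lit = min(NUM_LEDS, int(NUM_LEDS * magnitude / max_g))  # more motion -> more LEDs
    color = tuple(min(255, int(255 * abs(v) / max_g)) for v in (ax, ay, az))
    return [color if i < lit else (0, 0, 0) for i in range(NUM_LEDS)]

print(lightsaber_frame(1.2, 0.3, 0.8)[:5])
```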
also big big thanks to @ThomasStubblefield and @ky200617 for helping me out and encouraging me along the way! :)
schematic & kicad design -> github.com/themagicfrog/lightsaber
finally, here are some fun long-exposure light photos i took and pictures of the board!
Photoshopped the posters to make more sense for the lols
@KarthickArun: Had nothing else to hang this awesome scroll from the summit so I just put it on this holder for my whiteboard and propped it on two whiteboard markers I had lying around #the-summit @ThomasStubblefield
@ImDeet-U045B4BQ2T0: me and @ThomasStubblefield again, this weekend we built a couple of things, watch the video below
@SophiaPung: We're finally able to share the high-level schematic for the board (we have many more lower levels to the schematic hierarchy)!! A huge shoutout to our team @jc, @HenryBass-U02KEJ8T6D8, @Cheru, and @NilaRam for ALL the hard work we put in this week!!
🔥
Also a huge shoutout to everyone who helped us, fed us, and housed us (@ThomasStubblefield + @ImDeet-U045B4BQ2T0) this week. Thank you to @KaraMassie and @msw for coordinating with us and helping us through the logistics for this project. Thanks @NickyCase, @malted, @kognise, @belle, @karina for hanging out with us this week and keeping our spirits high. Next sprint we plan to write a programmable multilayer convolutional NN using wavelet transforms. Stay tuned for our progress next sprint (December 26th-January 1st), and come to our demo day on New Year's day if you're around (or tune in via zoom). Thanks to everyone who joined and participated in this channel, and follow our Twitter to see the memes we post from the week (OpenAI board memes, touching grass, team sink + more 😁). 3-2-1 MMI!!!!!!!
@jaspermayone: Ramen dinner with some HQ folks! Had so much fun, thanks @ThomasStubblefield for the invite, and Deeter, Fayd, jc, Malted and Nila for the very entertaining dinner conversation lol 🍜