jb
Previously Engine/Rendering Tech @idSoftware
interested in: C++/OpenGL/Vulkan/GPUs/Voxels/SDFs/Pathtracing/Photography/CpE/Electronics
Co-organizer of Graphics Programming Virtual Meetup
Project writeups, blog posts:
jbaker.graphics/index.html
- Any further updates can be found on my website: jbaker.graphics/writings/ind...
- Reposted by jb: Hi #portfolioday! I'm Megan Llewellyn and I'm a freelance medical illustrator available for new work! I specialize in highly detailed surgical scenes, and I have a lot of experience in drawing pathology, anomalous anatomy, and novel procedures. ✉️ m.rose.llewellyn@gmail.com 🫀 meganllewellyn.com
- Reposted by jb: The 2nd part of my dithering visual article is finally out! 🔗 visualrambling.space/dithering-pa... This one mainly explores the threshold map and how it generates those unique visual patterns. Hope you enjoy this as much as I enjoyed making it! Made with #threejs & #animejs
- Reposted by jb: And one of my very first works in the PSX low-poly style. For me, this is the simplest style of 3D modeling, but also the most interesting :) skfb.ly/oT6Wp
- Reposted by jb: Assuming uniform weights and frequencies, and random colors and anisotropy directions, this is what an example gaboronoi diagram would look like.
- vkCmdBlitImage2 is an extremely cool feature: if there is not a 1:1 pixel correspondence between src and dst, it actually uses the texture hardware to do the copy. Really interesting mix of high- and low-level features; you would typically have to write a little shader to do this.
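For reference, a minimal sketch of what a scaled (and therefore filtered) blit looks like with this API. The command buffer, images, and extents below are placeholders, and the images are assumed to already be in the transfer layouts.

```cpp
#include <vulkan/vulkan.h>

// Scaled blit: src (srcW x srcH) -> dst (dstW x dstH). Because the extents
// differ, the copy goes through the filtering/sampling path rather than a
// raw pixel copy. `cmd`, the images, and the extents are placeholders.
void blitScaled(VkCommandBuffer cmd, VkImage srcImage, VkImage dstImage,
                int32_t srcW, int32_t srcH, int32_t dstW, int32_t dstH)
{
    VkImageBlit2 region{};
    region.sType          = VK_STRUCTURE_TYPE_IMAGE_BLIT_2;
    region.srcSubresource = { VK_IMAGE_ASPECT_COLOR_BIT, 0, 0, 1 };
    region.dstSubresource = { VK_IMAGE_ASPECT_COLOR_BIT, 0, 0, 1 };
    region.srcOffsets[1]  = { srcW, srcH, 1 };   // srcOffsets[0] stays {0,0,0}
    region.dstOffsets[1]  = { dstW, dstH, 1 };   // different extent -> scaled

    VkBlitImageInfo2 blitInfo{};
    blitInfo.sType          = VK_STRUCTURE_TYPE_BLIT_IMAGE_INFO_2;
    blitInfo.srcImage       = srcImage;
    blitInfo.srcImageLayout = VK_IMAGE_LAYOUT_TRANSFER_SRC_OPTIMAL;
    blitInfo.dstImage       = dstImage;
    blitInfo.dstImageLayout = VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL;
    blitInfo.regionCount    = 1;
    blitInfo.pRegions       = &region;
    blitInfo.filter         = VK_FILTER_LINEAR;  // filtering only matters when scaling

    vkCmdBlitImage2(cmd, &blitInfo);
}
```

The filter setting only comes into play when the src and dst regions differ in size, which is exactly the case the post is about.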
- Reposted by jb: Speechless, jaw-dropping displays tonight. Fairbanks, Alaska. iPhone photos.
- Reposted by jb: Ever seen one of these, and wondered *why* they work? 🧵 This is a so-called Bitonic sorting network, and the illustrations show the two common ways to implement them. From an implementation point of view the two methods are effectively the same, so which one you pick just comes down to preference.
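Not part of the thread, just a minimal CPU sketch of the network it describes, assuming a power-of-two input size; on a GPU the innermost loop becomes one thread per element with a barrier between passes.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Bitonic sorting network: two nested loops over (stage, pass) wrapping a
// data-parallel set of independent compare-and-swap operations.
void bitonicSort(std::vector<int>& a)
{
    const std::size_t n = a.size();                  // assumed power of two
    for (std::size_t k = 2; k <= n; k <<= 1)         // stage: bitonic run length
        for (std::size_t j = k >> 1; j > 0; j >>= 1) // pass: compare distance
            for (std::size_t i = 0; i < n; ++i)      // each i is independent
            {
                std::size_t l = i ^ j;               // partner element
                if (l > i)
                {
                    bool up = (i & k) == 0;          // sort direction of this block
                    if ((up && a[i] > a[l]) || (!up && a[i] < a[l]))
                        std::swap(a[i], a[l]);
                }
            }
}
```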
- converted 3D physarum from DDA traversal to delta tracking and seeing a roughly 2x speedup in the worst case when there is a lot of scattering
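A rough sketch of the delta-tracking (Woodcock tracking) step that replaces a DDA march, under simplifying assumptions: a global majorant sigmaMax bounds the extinction everywhere, and the density() below is a stand-in for the physarum trail volume, not the actual simulation code.

```cpp
#include <cmath>
#include <random>

// Placeholder density: a soft sphere of extinction in [0, 1] (so sigmaMax = 1),
// standing in for the physarum trail grid.
float density(float x, float y, float z)
{
    float r2 = x * x + y * y + z * z;
    return r2 < 1.0f ? 1.0f - r2 : 0.0f;
}

// Delta tracking: sample free-flight distances against a constant majorant and
// probabilistically reject "null" collisions, instead of stepping voxel-by-voxel.
// Returns the distance to a real collision, or a negative value if the ray
// escapes before tMax.
float deltaTrack(float ox, float oy, float oz,
                 float dx, float dy, float dz,
                 float tMax, float sigmaMax, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    float t = 0.0f;
    while (true)
    {
        t -= std::log(1.0f - u01(rng)) / sigmaMax;  // free flight vs. majorant
        if (t >= tMax)
            return -1.0f;                            // escaped the volume
        float sigma = density(ox + t * dx, oy + t * dy, oz + t * dz);
        if (u01(rng) < sigma / sigmaMax)
            return t;                                // real collision
        // otherwise: null collision, keep going
    }
}
```

The win over a DDA presumably comes from taking distance-sampled steps instead of visiting every voxel along the ray.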
- Reposted by jb: A sneak peek of what I've been working on: A strand-based hair system for Jolt Physics (WIP): youtu.be/Rl19G0c003o
- Reposted by jb: Good idea using Gaussians for tiled floor vegetation: yunfan.zone/gswt_webpage/ #3dgs #gamedev #rendering
- Reposted by jb: Time for more Menu Cube posting 😌
- Reposted by jb: magazine screenshot (1993) archive.org/details/BYTE...
- Reposted by jb: Computer Animation Celebration VHS: 587s
- Reposted by jb: With my RTGI I am also doing something that, I believe, is a novel technique that I have not seen done before. While probes are often used in RTGI for indirect diffuse on-hit shading, I use them to replace all RTGI hit shading. On each hit I project the closest probes' radiance onto the hit.
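Not the poster's implementation, just a bare-bones interpretation of "shade hits from the closest probes", assuming a regular probe grid (at least 2 probes per axis) where each probe caches a single RGB radiance value; real probe systems store directional data such as SH or octahedral maps.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// On a ray hit, instead of evaluating lights/materials, trilinearly blend the
// cached radiance of the 8 surrounding probes of a regular grid.
struct ProbeGrid
{
    Vec3 origin;                 // world position of probe (0,0,0)
    float spacing;               // distance between neighboring probes
    int nx, ny, nz;              // probe counts per axis (>= 2 each)
    std::vector<Vec3> radiance;  // cached RGB per probe, size nx*ny*nz

    Vec3 probe(int x, int y, int z) const { return radiance[(z * ny + y) * nx + x]; }

    Vec3 shadeHit(Vec3 p) const
    {
        float fx = (p.x - origin.x) / spacing;
        float fy = (p.y - origin.y) / spacing;
        float fz = (p.z - origin.z) / spacing;
        int x0 = std::clamp(int(std::floor(fx)), 0, nx - 2);
        int y0 = std::clamp(int(std::floor(fy)), 0, ny - 2);
        int z0 = std::clamp(int(std::floor(fz)), 0, nz - 2);
        float tx = std::clamp(fx - x0, 0.0f, 1.0f);
        float ty = std::clamp(fy - y0, 0.0f, 1.0f);
        float tz = std::clamp(fz - z0, 0.0f, 1.0f);

        Vec3 out{0.0f, 0.0f, 0.0f};
        for (int dz = 0; dz < 2; ++dz)
            for (int dy = 0; dy < 2; ++dy)
                for (int dx = 0; dx < 2; ++dx)
                {
                    float w = (dx ? tx : 1.0f - tx) * (dy ? ty : 1.0f - ty) * (dz ? tz : 1.0f - tz);
                    Vec3 r = probe(x0 + dx, y0 + dy, z0 + dz);
                    out.x += w * r.x; out.y += w * r.y; out.z += w * r.z;
                }
        return out;
    }
};
```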
- Reposted by jb: Marvelling at the red brick reflections in the leather glove material, from the sun hitting the red brick house next to me.
- really digging this for a light config UI
- fun to just play around and look at how the different distributions affect the color card
- Reposted by jb: 5 months & 22 day exposure from Fort Point looking across the harbor to South Station Tower #solargraphy #pinholephotography
- this is something I came across in the X-Rite literature. The qualitative saturation difference actually corresponds to the difference in emission between the two illuminants: xenon has more wideband activity, and so it is "less saturated".
- picked up a swatchbook of filters corresponding to the gel data I found. cool to see the correspondence... the saturation differences here, I think, are because of the CRI of the LED relative to the xenon curve I have
- set up with Wenzel Jakob's "Low-Dimensional..Spectral Upsampling" to go from X-Rite color checker sRGB constants to spectral reflectance... then convolving reflectance, emission spectra, and wavelength color to get an approximation of the color card under that illumination
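A rough sketch of that convolution step, assuming the sigmoid-polynomial coefficients for a patch have already been fitted (the paper ships precomputed tables for sRGB), and that the illuminant SPD and CIE color matching functions are available as tabulated functions; all the names below are placeholders.

```cpp
#include <cmath>
#include <functional>

struct RGB { double r, g, b; };

// Jakob & Hanika reflectance model: a sigmoid of a quadratic in wavelength.
double upsampledReflectance(double lambda, double c0, double c1, double c2)
{
    double x = c0 * lambda * lambda + c1 * lambda + c2;
    return 0.5 + x / (2.0 * std::sqrt(1.0 + x * x));
}

// Integrate reflectance x illuminant x observer over the visible range to get
// XYZ, then convert to linear sRGB. No chromatic adaptation, so the
// illuminant's cast stays in the result (which is the point here).
RGB patchUnderIlluminant(double c0, double c1, double c2,
                         const std::function<double(double)>& emission,  // illuminant SPD
                         const std::function<double(double)>& xBar,      // CIE x-bar
                         const std::function<double(double)>& yBar,      // CIE y-bar
                         const std::function<double(double)>& zBar)      // CIE z-bar
{
    double X = 0.0, Y = 0.0, Z = 0.0, N = 0.0;
    const double dl = 5.0;                             // 5 nm steps
    for (double l = 380.0; l <= 730.0; l += dl)
    {
        double R = upsampledReflectance(l, c0, c1, c2);
        double E = emission(l);
        X += R * E * xBar(l) * dl;
        Y += R * E * yBar(l) * dl;
        Z += R * E * zBar(l) * dl;
        N += E * yBar(l) * dl;                         // normalize so white Y = 1
    }
    X /= N; Y /= N; Z /= N;

    // XYZ -> linear sRGB; gamma encoding omitted.
    return { 3.2406 * X - 1.5372 * Y - 0.4986 * Z,
            -0.9689 * X + 1.8758 * Y + 0.0415 * Z,
             0.0557 * X - 0.2040 * Y + 1.0570 * Z };
}
```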
- Reposted by jb: fav recent science fact: did you know that water strongly absorbs almost all wavelengths of light, EXCEPT for a tiny window right around the visible spectrum? this (+ UV radiation causing damage to DNA) is probably why the visible spectrum is visible to us! 🧪
- working on some UI preview for spectral distributions
- what's cool is that with the use of gel filter data, I can actually quickly see the normalized filtered spectrum of the given distribution... this can be used to generate the iCDF to importance sample this particular light's spectral power distribution. there can be more than one filter in the stack
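A minimal sketch of what building and inverting that CDF might look like, assuming the filtered spectrum (emission times every gel transmission in the stack) has already been tabulated at uniform wavelength steps; names and layout are placeholders.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Build a discrete CDF over a tabulated, filtered spectral power distribution,
// then invert it so a uniform random number maps to a wavelength with
// probability proportional to the filtered emission at that wavelength.
struct SpectralSampler
{
    float lambdaMin, lambdaStep;
    std::vector<float> cdf;      // cdf[i] = P(lambda falls in bins 0..i)

    SpectralSampler(const std::vector<float>& filteredSpd,
                    float lambdaMin_, float lambdaStep_)
        : lambdaMin(lambdaMin_), lambdaStep(lambdaStep_), cdf(filteredSpd.size())
    {
        float total = 0.0f;
        for (std::size_t i = 0; i < filteredSpd.size(); ++i)
            cdf[i] = (total += filteredSpd[i]);
        for (float& c : cdf) c /= total;              // normalize to [0, 1]
    }

    // Map u in [0,1) to a wavelength and its pdf (per nanometer), so bright
    // parts of the filtered spectrum get sampled more often.
    void sample(float u, float& lambda, float& pdf) const
    {
        std::size_t i = std::lower_bound(cdf.begin(), cdf.end(), u) - cdf.begin();
        float lo = (i == 0) ? 0.0f : cdf[i - 1];
        float binProb = cdf[i] - lo;
        float t = (binProb > 0.0f) ? (u - lo) / binProb : 0.5f;   // position within bin
        lambda = lambdaMin + (float(i) + t) * lambdaStep;
        pdf = binProb / lambdaStep;
    }
};
```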
- Reposted by jb: The video: youtu.be/il-TXbn5iMA
- Reposted by jb: I made a spectrograph to see if I can test my #astrophotography filters. How good is it? Hint: there are probably tweaks I can do to make it better. Video out now. youtu.be/6EeD9ejpDzY
- Reposted by jb: This reflective acrylic sheet shows how 1980s illustrators were able to trace real-life objects on the computer. Once it was aligned, it reflected whatever was in front of it over the monitor's display, allowing artists to trace it in a tool like MacPaint or MS Paint. Simple, low tech and effective.
- Reposted by jb: Okinawa Prefectural Museum & Art Museum (Okimu) in Naha, Okinawa, Japan by Ishimoto and Niki Architects (1975) r/brutalism
- Reposted by jb: This is true and I was the only one on set excited about it (and playing the song on my phone to blank stares)
- Reposted by jb: Back cover of Design (Japan), 097, 1967. designreviewed.com/artefacts/de... #graphicdesign
- Reposted by jb: Wall textures, Hampi, India, 2025. Pentax 645n, 45-85 4.5, Tmax 100 and Kodak Gold #BelieveInFilm #India
- Reposted by jb: 🚨 New blog post! 🚨 If you want to learn about: 🎨 Monochrome colour palettes 📊 Designing better black & white visualisations 🛠️ Rethinking single-colour chart design Read this ➡️ nrennie.rbind.io/blog/monochr... #RStats #DataViz #ggplot2 #RLadies
- it's interesting, some of the multiple-refraction interactions start looking like iridescence
- Reposted by jb: Volumetric shadow mapping from a single camera render. Computing the volume shadows is O(log(voxelResolution.z)) complexity.
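The log factor suggests a parallel prefix scan of per-voxel optical depth along the light axis, with each pass doubling the accumulated distance. A guess at the technique rather than the poster's code: a CPU sketch for a single voxel column, where on the GPU each pass would be one dispatch over the whole volume.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hillis-Steele style inclusive scan of per-voxel optical depth (sigma * dz).
// After log2(depth) passes each voxel holds the accumulated optical depth up to
// and including itself, so its shadow term is just exp(-sum).
std::vector<float> columnShadow(std::vector<float> opticalDepth)
{
    const std::size_t depth = opticalDepth.size();
    for (std::size_t step = 1; step < depth; step <<= 1)   // log2(depth) passes
    {
        std::vector<float> next = opticalDepth;
        for (std::size_t z = step; z < depth; ++z)
            next[z] = opticalDepth[z] + opticalDepth[z - step];
        opticalDepth.swap(next);
    }
    std::vector<float> transmittance(depth);
    for (std::size_t z = 0; z < depth; ++z)
        transmittance[z] = std::exp(-opticalDepth[z]);     // per-voxel shadow term
    return transmittance;
}
```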