Storybook supports every major view layer, countless workflows, and legions of frontend developers. Despite giant leaps in functionality over the last few years, the core user experience hasn't changed since version 5.0.

Last month, I shared a sneak peek of Storybook 7.0's design, which streamlines core UI patterns that devs use every day. I'm thrilled to announce that these updates are now available in alpha (instructions below).

We believe small ergonomic improvements add up to big productivity boosts over time. Thousands of teams use Storybook to ship UIs around the world, from global companies like Shopify and Microsoft to government services in the Netherlands, UK, and Italy. In the last 3 years, our features expanded to meet the needs of a growing community. Storybook is now used for UI development along with testing and documentation. 7.0 revamps Storybook's ergonomics to integrate these workflows into one seamless experience.

Rest assured, we're updating the design in parallel with work on performance, stability, and bundle size that befits a major version bump. Sign up to the mailing list to get early access to these features.

Layout updates

7.0 alpha expands the Canvas edge-to-edge. That gives you more room to build and document your UI. Every pixel dedicated to development makes Storybook more convenient to use. You'll find a new, more subtle menu, along with spacing refinements to increase the search bar's clickable area. All of this while preserving the information density you'd expect from a developer tool.

Storybook comes with a set of tools that help you debug UI appearance and layout. 7.0 reorganizes the tools to reduce how much you have to move the mouse (or tab, for keyboard users):

- Measure: measure dimensions and spacing in pixels.
- Viewports: cycle through preset viewport dimensions (also customizable).
- Zoom: zoom in and out to inspect each pixel.
- Remount: remount the currently selected story.
Since the release of the latest version of the MediaPipe handpose detection machine learning model, which allows the detection of multiple hands, I've had in mind to try to use it to create UIs. Here's the result of a quick prototype built in a few hours!

Before starting this, I also came across two projects mixing TensorFlow.js and Figma: one by Anthony DiSpezio to turn gestures into emojis, and one by Siddharth Ahuja to move Figma's canvas with hand gestures.

I had never made a Figma plugin before, but decided to look into it to see if I could build one to design UIs using hand movements.

The first thing to know is that you can't test plugins in the web version of Figma, so you need to install the desktop version while you're developing. Then, even though you have access to some Web APIs in a plugin, access to the camera and microphone isn't allowed for security reasons, so I had to figure out how to send the hand data to the plugin.

The way I went about it is using Socket.io to run a separate web app that handles the hand detection and sends specific events to my Figma plugin via websockets. Here's a quick visualization of the architecture:

In my separate web app, I'm running TensorFlow.js and the hand pose detection model to get the coordinates of my hands and fingers on the screen and to create some custom gestures.

Without going into too much detail, here's a code sample for the "zoom" gesture:

```javascript
let leftThumbTip,
  rightThumbTip,
  leftIndexTip,
  rightIndexTip,
  leftIndexFingerDip,
  rightIndexFingerDip,
  rightMiddleFingerDip,
  rightRingFingerDip,
  rightMiddleFingerTip,
  leftMiddleFingerTip,
  leftMiddleFingerDip,
  leftRingFingerTip,
  leftRingFingerDip,
  rightRingFingerTip;

// The condition was truncated in the source; `hands && hands.length`
// is a minimal reconstruction checking that hands were detected.
if (hands && hands.length) {
  // ... (rest of the gesture logic truncated in the source)
}
```
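The gesture logic above relies on fingertip landmarks coming from the detector. As a rough sketch of how the detection app could obtain them with the official `@tensorflow-models/hand-pose-detection` package — the keypoint names are the model's own, but the helper functions are my illustrative additions, not from the original post:

```javascript
// Illustrative helpers (not from the original post) for working with
// hand keypoints. A keypoint looks like { x, y, name }, where name is
// one of the MediaPipe Hands landmark names, e.g. 'thumb_tip'.
function getKeypoint(hand, name) {
  return hand.keypoints.find((kp) => kp.name === name);
}

// Euclidean distance between two keypoints, e.g. to measure how far
// apart two fingertips are when recognizing a "zoom" gesture.
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Detector setup, shown as comments because it needs the
// @tensorflow-models/hand-pose-detection package, a TensorFlow.js
// backend, and a <video> element fed by the camera:
//
//   const detector = await handPoseDetection.createDetector(
//     handPoseDetection.SupportedModels.MediaPipeHands,
//     { runtime: 'tfjs', maxHands: 2 } // two hands for two-handed gestures
//   );
//   const hands = await detector.estimateHands(videoElement);
//   // hands[0].keypoints -> [{ x, y, name: 'thumb_tip' }, ...]
```

With helpers like these, a zoom gesture could be detected by tracking `distance(getKeypoint(leftHand, 'index_finger_tip'), getKeypoint(rightHand, 'index_finger_tip'))` across frames: a growing distance means zoom in, a shrinking one means zoom out.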
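To bridge the two sides, the detection app can emit events that the plugin's UI thread forwards into Figma's sandbox. A minimal sketch, assuming Socket.io on both ends; the `gesture` event name, the port, and the payload shape are my own illustrative choices, not taken from the original post:

```javascript
// Illustrative payload shape for events sent over the websocket
// (my own choice, not from the original post).
function makeGestureEvent(type, data) {
  return { type, data, at: Date.now() };
}

// The surrounding plumbing is shown as comments because it needs a
// running Socket.io server and the Figma desktop app:
//
// 1. Detection web app (requires socket.io-client):
//      const socket = io('http://localhost:3000');
//      socket.emit('gesture', makeGestureEvent('zoom', { scale: 1.2 }));
//
// 2. Figma plugin UI (ui.html) — the UI iframe can open websockets,
//    the plugin sandbox cannot, so the UI forwards events inward:
//      socket.on('gesture', (event) => {
//        parent.postMessage({ pluginMessage: event }, '*');
//      });
//
// 3. Plugin main code — react to the forwarded message:
//      figma.ui.onmessage = (msg) => {
//        if (msg.type === 'zoom') {
//          figma.viewport.zoom *= msg.data.scale;
//        }
//      };
```

The indirection in step 2 is forced by Figma's architecture: only the UI iframe can talk to the network, and only the main thread can touch the document, so `postMessage`/`figma.ui.onmessage` is the standard hand-off between them.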