[HN Gopher] Figma AI
___________________________________________________________________
Figma AI
Author : jordansinger
Score : 178 points
Date : 2024-06-26 17:39 UTC (5 hours ago)
(HTM) web link (www.figma.com)
(TXT) w3m dump (www.figma.com)
| jacobp100 wrote:
| It looks like a solid set of updates. A lot of designers seem
| worried, but I'm not sure they need to be. It was always the case
| that you could get prebuilt templates and the like, but companies
| still prefer to have professional designers. I imagine it'll make
| prototyping much faster.
| karaterobot wrote:
| Watching the keynote, the feature where it spits out a design
| based on a prompt seems like a gimmick. If it doesn't work with
| MY design system (as opposed to curated design systems like MUI
| or Bootstrap or whatever) then it's useless. It's also probably
| not ready to design complex applications, as opposed to brochure
| websites and apps that fit into very common templates.
|
| The AI-powered search features seem really promising, as finding
| the right design file is a problem I face every day. What they
| showed was the ability to paste a screenshot (e.g. from
| production) and have it find where that design came from, or use
| a text search to do the same. That's something that'll make me
| squeal with joy if it actually works.
|
| Filling in mock data in designs is also a big potential quality
| of life improvement. The domain I work in is so specific that I'm
| not sure it'll be of practical use to me, but I'm hopeful.
|
| All in all, I'm glad about the parts where they are trying to
| target some specific pain points with AI, skeptical about the
| rest.
| jjcm wrote:
| PM here for Make Designs.
|
| > If it doesn't work with MY design system (as opposed to
| curated design systems like MUI or Bootstrap or whatever) then
| it's useless
|
| You hit the nail on the head. This is our #1 focus for this team
| moving forward. As you may have noticed in the keynote, Dylan
| highlighted Chen Chen and the Google Material team, both of
| whom created generators for their design systems. They work,
| but they were hard to make as the tooling to create these isn't
| polished or ready yet. It's something we're actively thinking
| about and improving in order to make this useful for those with
| existing design systems.
| tsunamifury wrote:
| The problem here isn't whether it uses your pattern library or
| not. The issue is it will spit out static designs that look
| right but aren't actually based on either user goals or
| business goals of the product. It's just a design that looks
| right. It's missing the entire workflow design part and in
| fact covers over the need for it.
|
| I see it leading a lot of less experienced product programs
| astray.
| choppaface wrote:
| Are you really shooting to generate something full-fidelity?
| Or rather provide lower fidelity but empower the user to
| spitball and maybe test light interaction?
|
| And if it is full fidelity, how do you ensure users actually
| own the copyright at the end of the day?
| jerrygenser wrote:
| v0 by Vercel only outputs one UI library, shadcn/ui. Useful if
| you want to use that library, for example when starting a
| project from scratch. Potentially Vercel will add more.
|
| https://v0.dev/
| leerob wrote:
| You can also output HTML / Tailwind, but the ambition is to
| support custom React component libraries & design systems.
| kristopolous wrote:
| Most projects are pretty generic. I've been on countless ones
| where they just want something that looks like everything else.
| If this can deliver that and I can do it without having to
| contract out a designer, it's a win.
| hbosch wrote:
| The race is on. What will AI deliver first...? Production-
| ready code so PMs don't need engineers, or production-ready
| designs so PMs don't need designers?
|
| The reality is that without domain experts (design, research,
| engineering, business, law) you're left trusting the AI to
| make the decision without a good way to check if it made the
| right one.
| alfalfasprout wrote:
| Oh you'll find out alright... when it's too late. I'm just
| waiting for a year or two from now when companies realize
| what a colossal hole they dug themselves into by using a bunch
| of AI-generated crap in production.
|
| We're already seeing it with marketing copy.
| kristopolous wrote:
| Design is different. It's become literally just whatever
| the boss likes.
|
| You can tell me about the math, theories and laws but look
| around you, nobody cares about those any more.
| Hoasi wrote:
| > You can tell me about the math, theories and laws but
| look around you, nobody cares about those any more.
|
| It's true. Just a few years back, design thinking was all
| the rage. Nowadays, even apps from flagship companies
| feel all over the place. Attention to detail, unobtrusive
| interface, intelligent symbols, subdued color palettes,
| rich illustrations, all that is out the window now. Why
| should we need good design when we can replace it with
| cheaper "AI" templates?
| hn_throwaway_99 wrote:
| > The AI-powered search features seem really promising, as
| finding the right design file is a problem I face every day.
| What they showed was the ability to paste a screenshot (e.g.
| from production) and have it find where that design came from,
| or use a text search to do the same. That's something that'll
| make me squeal with joy if it actually works.
|
| I've found that, generally, "better search" is the biggest
| productivity bump I get from gen AI at the moment. That is,
| lots of times I don't know the keyword of what I want to find
| beforehand, and with things like ChatGPT I can just describe it
| to find out. E.g. recently I needed to replace every occurrence
| of one value with another in a Postgres array column, so I just
| described to ChatGPT what I wanted to do. It would have taken
| me a lot longer to hunt through the docs to find the
| `array_replace` function and get usage examples.
|
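| A rough sketch of where that landed, for reference - the table
| and column names are made up, and I'm calling it through
| node-postgres here purely for illustration:
|
|       // Sketch only: "posts" and "tags" are hypothetical names.
|       // array_replace() is the Postgres built-in that swaps every
|       // occurrence of one element in an array for another.
|       import { Client } from "pg";
|
|       async function renameTag(oldTag: string, newTag: string) {
|         const client = new Client(); // reads PG* env vars
|         await client.connect();
|         await client.query(
|           "UPDATE posts SET tags = array_replace(tags, $1, $2)",
|           [oldTag, newTag]
|         );
|         await client.end();
|       }
|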
| Plus, in the "better search" use case I'm much less concerned
| about hallucinations as I'm really only using ChatGPT to get me
| started in the right direction in the first place.
| iknowSFR wrote:
| Every AI tool being released by 3rd parties is being blacklisted
| by my company. The primary reasoning is around sharing data with
| the tool itself. On the other hand, if my clients allow them, there's
| a long review and approval process followed by continuous audits.
|
| Is anyone else running into this at their companies? I look at
| these tools and get excited but I'm continuously being blocked
| from using them for work.
| digging wrote:
| My organization is kind of just not talking about AI tools and
| the obvious risks. We've gotten neither a ban nor an all-clear,
| leaving us to use it according to our own judgment and not talk
| about it very much. I think more companies are in this boat,
| and while I understand your frustration, your company is making
| the wiser move (although it may bite them; some of the more
| reckless companies will die for their carelessness but others
| will survive and pull ahead faster).
| YetAnotherNick wrote:
| I saw this in a few of my friends' companies. I've asked
| everyone how storing data in Azure is any better than using the
| Azure ChatGPT API, and so far I haven't been able to get a
| good answer.
| mrbungie wrote:
| This. If you don't trust your provider when it says it won't
| use data you embed in your LLM API calls, why do you trust
| them when you use any of their other services?
| shermantanktop wrote:
| Because those other services are built with clear
| expectations of tenant isolation, and cross-tenant data
| leakage would be a near-fatal event.
|
| But the models behind these AI tools have a single-tenant
| core, with tenant isolation added on as a heroic effort to
| fake what the technology does not support by default.
| mvdtnz wrote:
| That's like asking why it's ok to use AWS to host my
| containers but it's not ok to leave company files in some
| other guy's S3 bucket.
| sebzim4500 wrote:
| No it isn't, the Azure ChatGPT API is operated by Azure,
| not OpenAI.
| TeMPOraL wrote:
| Still, it makes sense, as if your company is serious
| about data protection in the first place, it isn't just
| "using Azure", but has a contract with guarantees.
| Guarantees that don't carry over to randos using Azure
| just because it's Azure.
| constantcrying wrote:
| The answer is likely in the contract your company has with
| Microsoft.
| burningChrome wrote:
| My very large corporation is 100% all in on Figma, but it has
| recently blocked several tools and signaled (through meetings
| and memos) that Figma's AI tools will also be blocked.
|
| As an aside, I work in accessibility and our team started using
| some AI and automated tools to do some A11Y testing and much of
| the functionality has been blocked. We were really interested
| in the company's mobile testing tool, but now we won't be able
| to use that because of the network restrictions.
|
| I've also heard complaints from Devs who said GitHub Copilot
| has also been blocked and they've been in discussions with the
| security team to try and get approved, but no luck thus far,
| they're not budging on allowing it yet.
|
| So you're right, it's not just one app or tool; companies are
| starting to block this stuff wholesale across their orgs.
| johnfn wrote:
| I don't get this. Obviously Figma already stores all your
| designs in their cloud. What's the additional problem with also
| getting AI suggestions back from Figma?
| illumanaughty wrote:
| The problem is when other people are getting 'AI suggestions'
| based on your work. What if you don't own the copyright for
| the work you're producing (it's for a client)? What if you
| don't want AI being trained on your data?
| taylorlapeyre wrote:
| By default, Figma will not use organizations' data for
| training.
| creativenolo wrote:
| But only for Organization and Enterprise plans!
|
| > Starting today, admins can set that content data
| training preference directly in settings, across all
| plans.
|
| > - Starter and Professional plans are opted in by
| default, but can opt out.
|
| > - Organization and Enterprise plans are opted out by
| default.
| hn_throwaway_99 wrote:
| Figma is explicit about this:
|
| > Two important highlights: First, all admins have control
| of whether their team's content data is used for training.
| Second, participation in AI content training is not
| required to use Figma or Figma's AI features. Learn more
| about our approach to training.
|
| Blanket bans on third party AI tools don't make any sense
| to me. As the parent commenter said, they already have all
| your data, so you already have to trust them with that. Why
| would you trust them in those other areas but not trust
| their explicit statements that say that you can disable
| training on your data?
| moffkalast wrote:
| If Figma takes a little inspiration from Adobe (they were
| almost acquired at one point after all) they'll realize
| they can change their TOS at any time for any reason with
| impunity. Such statements are at best dubious and at
| worst complete unprovable bullshit.
|
| The only way to be sure your data stays yours is local
| models running on your machines.
| fnordpiglet wrote:
| This applies to all non self hosted software and even
| much self hosted software that radios out invasive
| telemetry. The difference between offering an AI feature
| or not offering it is time, but collecting your data as
| training for that eventually inevitable AI feature is
| happening now, everywhere, all the time. The only defense
| is retreat into a SCIF or depend on ToS, and as you point
| out ToS is a very weak defense. Things are moving
| considerably faster than law, regulation, judicial
| review, and establishment of a compliance framework can
| possibly accommodate.
|
| Welcome to the future you were promised! It's already too
| late.
| sofixa wrote:
| > Blanket bans on third party AI tools don't make any
| sense to me
|
| It's an easy, safe and secure default.
| hn_throwaway_99 wrote:
| I agree with this, I think I was just interpreting "ban"
| differently. I.e. if the default is you can't use it, but
| then there is a specific request/review/audit process to
| allow it (as original GP's comment said), that makes
| sense. I just don't think it makes sense to ban without
| an exception process.
| 542458 wrote:
| I think the "easy" bit is increasingly not true, given
| that blanket AI feature bans mean you've banned both Mac
| OS and Windows.
| __loam wrote:
| Is that control opt-out by default?
| hn_throwaway_99 wrote:
| Another commenter mentioned that the default is no
| training for Enterprise and Organization accounts, but
| for Starter and Professional accounts the default allows
| training and you need to explicitly opt out.
| constantcrying wrote:
| What you have to do to get around this is negotiate a
| contract with the company providing the AI tools that
| explicitly forbids them from doing that.
|
| This is how corporations solve these kinds of issues. As soon
| as you as a company are _a customer_ instead of some random
| person using their AI tools, you start to actually have
| influence over the other company.
| hn_throwaway_99 wrote:
| As mentioned elsewhere, you don't need some special contract.
| Figma allows anyone to opt out of training (though I think
| it's a fair point that only Enterprise and Organization plans
| are opted out by default, while on Starter and Professional
| plans you have to explicitly opt out).
|
| Nearly every SaaS tool I've seen that has AI features lets
| you opt out of training, as these SaaS companies know that
| this is a deal breaker for many of their clients.
| constantcrying wrote:
| Which I think is also unsurprising; it takes the insanity
| of Adobe to want to claim your customers' work for yourself.
|
| But this is exactly the kind of situation contracts are for.
| Even if it is a standard one, the contract is the means by
| which you can hold the AI company to not using your data.
| animex wrote:
| If Figma is using the OpenAI API on the back end for this, does the
| data they relay to OpenAI get protected? (Not sure if they do,
| but curious if their solution is on-prem or being handled by a
| 3rd party)
| alfalfasprout wrote:
| Yep lots of companies are (rightfully) being careful about 3rd
| party AI tools. You're sending valuable company assets or in
| some cases even PII (though probably not for Figma) to another
| party and you're basically hoping they aren't being careless.
|
| Total security (and possibly legal) nightmare.
| itronitron wrote:
| All that page demonstrates is that Figma facilitates a lot of
| design features that have no utility for the user.
| ProfessorLayton wrote:
| Not everything listed here, but things like layer naming
| assists, translation, and mockup auto-fill definitely help
| designers focus on what's important to the user.
|
| More time spent on user issues rather than monotonous work
| sounds like a win-win.
| klabb3 wrote:
| Is it just me or did gen AI allow companies to cheat a lot more
| with their demos/showcases? I feel like there's absolutely no way
| to tell if what's shown is representative, cherry-picked or
| outright faked. I mean, it's non-reproducible by nature, so it
| even gives plausible deniability to unscrupulous marketing
| departments.
|
| I'd rather watch a YouTuber or streamer do a real project in a
| tool to see if and how it works in practice.
| jjcm wrote:
| PM here on the AI streams. These were all live demos, not
| faked. Was not great for my anxiety given the non-deterministic
| nature.
|
| We did talk about faking it, but Dylan was heavily opposed as
| he felt it wouldn't be genuine.
| devmor wrote:
| I don't think it's that big of a change. I've sat on both sides
| of many meetings where designers showcased mockups of
| functionality that didn't exist and wasn't even sanity-checked
| for a possibility to be engineered in the first place.
| karaterobot wrote:
| If you watched the keynote, it was pretty clearly not cherry-
| picked or faked. It didn't come up with perfect images or text
| every time. It felt very much like what happens when you enter
| a prompt into a chat-based AI.
|
| As additional proof that it was not faked, the CEO was clearly
| distracted by the notifications from hundreds of people
| requesting access to the live file he was demoing from, which
| was a pretty good live demo moment.
| ChipperShredder wrote:
| I don't like Figma, why is there always stuff on the front page
| about Figma? Is Figma a YC property? That would make sense, I do
| see YC businesses heavily promoted here. Works for Berkshire
| Hathaway.
| dang wrote:
| Figma is not a YC company.
| Matticus_Rex wrote:
| Figma is often frontpage because it's one of the most-used
| tools on the market for tech companies, and it's therefore
| relevant to a lot of the people here.
| josefrichter wrote:
| Naming layers has been a problem for _decades_. There are even
| memes about it. With the advent of AI, we all knew this was a
| perfect match where AI could help from day one. Happy to see it
| arrive in Figma now.
| photon_collider wrote:
| CSS has had a similar issue with naming classes, which is one
| reason behind TailwindCSS's design. I wonder if we'll see more
| AI tools for these kinds of use cases.
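| As a rough illustration of the naming point (the component and
| class names below are just made up, nothing Figma-specific):
| with Tailwind the utility classes live on the element, so
| there's no new class name to invent at all.
|
|       // Hypothetical TSX sketch: instead of naming and maintaining
|       //   .card-header { padding: 1rem; font-weight: 600; }
|       // the utilities are composed directly on the element.
|       export function CardHeader({ title }: { title: string }) {
|         return (
|           <h2 className="p-4 font-semibold text-gray-800">{title}</h2>
|         );
|       }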
| thomasfromcdnjs wrote:
| Yeah this one is crazy cool. I'm not even a designer, but my
| Figma prototypes are trash to manage after a while; any
| approximate name is better than my Layer 1231.
| raincole wrote:
| I honestly just want a good AI vector icon generator. An i2i
| (image-to-image) one, not just a text-prompting one.
| logicchains wrote:
| What's Figma? They don't make it very clear.
| solardev wrote:
| A popular UI design tool that UX/UI designers often use to mock
| up screens to hand to frontend developers who then implement
| it.
| thecolorblew wrote:
| It's the de facto digital design tool in the industry, so
| they've kind of earned the right not to preface with that
| information.
| hamasho wrote:
| What happened to the MS Office and Copilot integration? More
| than a year ago they gave impressive demos[1] showing users
| creating high-quality PowerPoint decks with good design and
| content just by prompting a few sentences. I don't have a
| Copilot license, so I haven't used it myself. If someone uses
| it daily, how good is it?
|
| [1] https://www.youtube.com/watch?v=S7xTBa93TX8
| danfromplus wrote:
| It's quite bad. It just makes every slide the same - 1 image +
| 3 bullet points - and tries to obfuscate that by putting it
| into a few different layouts.
| troupo wrote:
| It's quite telling that they are showcasing a badly working
| AI feature that no one really asked for.
|
| And yet the pricing controls that people actually want? Oh,
| 6 to 12 months, maybe.
| toddmorey wrote:
| They have not yet started doing any training on user content and
| I applaud them for that. (They've only used public community
| files so far.)
|
| However, they are headed that way to support advanced AI
| features. Quoting Figma:
|
| > Two important highlights: First, all admins have control of
| whether their team's content data is used for training. Second,
| participation in AI content training is not required to use
| Figma or Figma's AI features. Learn more about our approach to
| training.
|
| PLEASE, PLEASE make that opt-in versus opt-out. Do the right
| thing here, Figma!
| flappyeagle wrote:
| It's opt out for lower plans and opt in for higher plans
| 542458 wrote:
| Higher plans actually can't opt in. If I look in settings on
| my organization plan I cannot turn on the allow training
| checkbox - it's disabled in an off state.
___________________________________________________________________
(page generated 2024-06-26 23:01 UTC)