
AI in Design: Revolutionizing Embellishments with Matt Redbear

Digital Embellishment Designer Meetup Recap (January 22, 2026)


When we opened the January Digital Embellishment Designer Meetup, you could feel it right away. The room was packed, the chat was buzzing, and the topic was not a casual “someday” conversation anymore. AI is already in the workflow for a lot of designers and production teams, even if it is showing up in small ways, like quicker selections in Photoshop, faster background removal, or using ChatGPT to sanity-check an idea.


That is what made this session so valuable. It was not an AI hype-fest, and it was not an AI panic session either. Matt Redbear showed up with a practical, production-aware perspective and a simple promise: AI is not here to replace designers. It is here to reduce friction, speed up decisions, and help you explore more creative options in the time you used to spend making one.


If you design for embellishment, you already know why that matters. The most painful parts of this work are not usually the “creative spark.” The pain is the bottleneck work that kills momentum: tracing, masking, separating, second-guessing registration assumptions, and trying to predict what will actually happen on press. That is where AI, used correctly, becomes a legitimate advantage.


What follows is a readable, start-to-finish recap of the key ideas from the session, with enough detail to be actionable, but without turning the whole thing into a list of soundbites.



AI and embellishment design belong together, even if it feels weird at first


Matt kicked off with a quick analogy that landed because it was honest. For some people, "AI in design" and "digital embellishment" feel like flavors that should not mix. He compared it to mint and chocolate. Some people swear they do not belong together. Others love the combination. Either way, the debate is not really the point anymore, because AI is already here, quietly baked into the tools many of us use every day.


The shift Matt wanted the room to make was simple: stop thinking of AI as a replacement for design, and start thinking of it as a design assistant. Something that can help you move faster through tedious steps, generate options you might not have considered, and reduce the hesitation that often shows up when you are trying to design an embellishment mask under pressure.


That last part is key. Designers are not usually short on creativity. They are short on time, and they are often stuck dealing with unpredictability. When a process feels unpredictable, you hesitate. When you hesitate, you lose flow. And when you lose flow, your best ideas do not show up.


AI, in Matt’s view, is less about “making art for you” and more about removing the friction that blocks good design decisions from happening quickly.


Why embellishment design is harder than CMYK design


One of Matt’s clearest points was also one of the most important: embellishment design is not CMYK design.


It is a different mental model. It is closer to spot color thinking than process thinking. It is built on layers, masks, registration realities, minimum line weights, coverage thresholds, substrate behaviors, and press constraints. Those are not “optional details.” They are the difference between a piece that feels premium and a piece that feels sloppy.


In CMYK, you can sometimes “pretty your way out” of a decision. A small mistake might be hidden in the image, or it might get lost in the overall composition. With embellishments, mistakes show up loud and proud. If your assumptions are wrong, the output will not just be “a little off.” It will often look unprofessional, and it will undermine the effect you were trying to create.


Matt shared a recent example from his own week: trading card artwork from a professional design shop that still failed because of bad registration assumptions and file setup. That was not a dunk on the designers. It was proof that embellishment design is its own craft, and it punishes shortcuts.


As the conversation opened up, the same pain points surfaced again and again:


Tracing and precise placement take time.

Mask creation can be the slowest part of the entire project.

Tool overload is real. There are too many options and too many “AI tools” that might work… or might waste your time.

Predictability is hard. You hesitate because you do not fully trust what will happen on press.

Creative flow gets interrupted when you are fighting the file instead of designing the idea.


That is the practical reason AI is so attractive in this space. Not because it can “design better than humans,” but because it can remove the bottlenecks that kill momentum.


The real bottleneck often starts with intent, not the tool


A moment that mattered in this session was when the group started talking about what actually slows them down. Ryan Moskun gave the answer that experienced embellishment folks will recognize immediately: the bottleneck often starts before masking. It starts with intent.


What are you trying to do with the embellishment?


Are you drawing the eye?

Inviting touch?

Highlighting the subject?

Or doing the opposite, highlighting everything except the subject so the subject stands out more?


Those are different strategies, and they lead to different mask decisions. The problem is that many designers get handed a file and told, “Embellish this,” without a clear definition of what the embellishment needs to accomplish. You can spend a lot of time making a technically correct mask that is conceptually wrong.


Lisa’s comment landed on the production side of the same reality: tracing and fine-detail placement are still brutal. When a customer wants specific details embellished, there is no shortcut. You are in the art, moving slowly, cleaning edges, refining the mask. It is not glamorous, but it is essential.


Matt connected those points with a practical observation: even when you know what you want, you can still get stuck because you are not sure which tool will get you there fastest. That is where AI starts to matter. Not because it has taste, but because it can help you get unstuck and get moving.


AI is not taste, and that is the point


Every AI conversation eventually hits the fear: “Is this going to replace designers?”


It came up here too. Matt addressed it directly. AI does not replace human creativity. It amplifies it. AI is not taste. AI is not judgment. AI is not story. AI is not context.


Karlien Murray captured the whole idea in one line: AI cannot replace good taste.


Sabine added the practical boundary that keeps everything sane: AI is great as an idea generator. AI is not great as a decision maker.


That distinction is the foundation for using AI responsibly in our world. AI can suggest. It can propose. It can generate options. It can help you explore quickly. But the designer decides what works and what does not.


Matt also made a point that is easy to overlook: one of the most powerful skills when using AI is knowing what to ignore. If the AI response is garbage, call it out. Tell it that the idea is wrong and ask it to explain why. The conversation you have with AI is iterative, and your feedback is part of what shapes the usefulness of the tool.


The “foil placement” experiment, and why it matters


One of the most memorable moments in the session was the question: would you trust AI to suggest foil placement?


The room’s answer was basically, “No… but also yes, kind of.” Not as a final decision. As a suggestion engine.


I shared something I did live while Matt was presenting. I took a screenshot from his deck, dropped it into ChatGPT, and asked where it would place foil. The response was surprisingly reasonable: foil the word “AI,” accent the cat’s eyes, optionally hit whiskers, avoid the snow, and keep it minimal for impact.


That does not mean AI is now your creative director. It means AI can be a fast way to generate starting points and break creative gridlock. When you are stuck trying to come up with three or four variations, AI can help you get a first draft of ideas, which you then refine using actual taste and production knowledge.


This is the most realistic “today” use case for AI in embellishment placement: accelerate ideation, not final decisions.


The most important mindset shift: masks are not technical chores, they are design layers


This was the heart of Matt’s presentation.


Most teams treat embellishment masks as technical artifacts. Something you “have to do” to run the job. “Make the foil layer.” “Make the spot UV layer.” “Make the white layer.”


Matt’s reframing was powerful: masks are design layers. The mask is not just a requirement. It is where a big part of the artistry lives.


Then he went further by naming the pattern he sees constantly in real-world production work. Most embellishment masks are extractions. Designers pull an element from the page and coat or foil it. Text gets foiled. Product images get clear coated. Logos get spot varnish. Those are easy, and sometimes they are correct, but they are rarely the best version of what embellishment can do.


The best work, according to Matt, is interpretation.


Interpretation means you are not simply extracting what exists. You are creating a new layer that adds hierarchy, energy, boundary definition, and tactile logic to the piece. You are using patterns, directionality, and micro-detail to control how the eye moves and how the hand experiences the piece.


Matt showed a case study that made this obvious: a base CMYK image that was blurry and low-definition, then a set of pattern-driven varnish overlays and foil pinstriping that suddenly made the image readable and intentional. It was not just “adding shine.” It was adding structure.


That is a big takeaway for anyone building an embellishment design culture: stop thinking of the mask as the output of a mechanical step. Start thinking of it as the layer where you create meaning.


Designing for 5th and 6th stations: treat specialty layers like purpose-built tools


Matt also shared some grounded principles for designing with specialty stations, especially for digital presses that support metallics, white, fluorescent, and clear.


He treats gold and silver as a light source, not a color. The metallic is not meant to replace CMYK. It is meant to add shimmer and illumination under CMYK or in selective areas that behave like highlights.


He treats white toner as an isolation and control layer. Even white paper has a measurable color. White toner can create a more controlled base, which improves brand color accuracy, reduces paper shift, and can block metallic reflection where you do not want it.


Fluorescent pink is used as an accent, edge energy, and eye-tracking element. It can also act as a gamut extender, pushing certain hues into a more electric range.


Clear is context-sensitive. It can soften dark tones, create satin effects, flatten glare, or behave like a varnish overlay depending on what is beneath it and what stock you are printing on.


This portion of the discussion reinforced a key truth: AI does not remove fundamentals. If anything, it increases the value of fundamentals. The faster you can generate design options, the more important it becomes that you know what actually works when those options hit paper.


The “old tools” still matter, even in an AI world


Before Matt went deep on AI segmentation, he reminded everyone that the foundation still matters. Photoshop masking techniques like Quick Select, Color Range, channel-based masks, Select and Mask edge refinement, and even the old Photocopy and Stamp filters are still incredibly useful.


We laughed about Photocopy and Stamp because it is one of the original “digital embellishment hacks.” And yes, it is still relevant. Both Matt and I still use it regularly because it creates edges and mask behaviors that are hard to replicate any other way.


The point was not nostalgia. The point was that AI becomes another tool in your belt, not the entire belt.


Photoshop Beta is one of the easiest on-ramps to AI


Matt made a clear distinction between standard Photoshop and Photoshop Beta.


If you want stability, use the standard version.

If you want the newest AI selection tools early, use Beta.


He uses Beta almost exclusively for masking because the experimental tools show up there first. If they are unstable, they stay in Beta. If they mature, they move into the main product.


This matters because it gives designers a low-friction path to using AI without adopting a completely new workflow. Many people are "using AI" already without calling it that: newer selection features, subject detection, background removal, and smarter edge refinement are all AI under the hood.


SAM 2 segmentation: how Matt is actually using AI to speed up masks


The most tactical part of the session was Matt’s walk-through of using Meta’s Segment Anything Model (SAM), specifically the SAM 2 demo, to isolate objects quickly.


The value here is straightforward. Instead of manually tracing complex objects, SAM can isolate them based on what it “sees” in the image. Matt described a workflow where you export the segmentation with a distinct solid color fill, bring it into Photoshop, and then use that fill as a fast selection and masking base.


The reason he likes the “green screen” approach is practical. Green screen is uncommon in real photography, which makes it easy to isolate and select cleanly.


Where this really shines is complex edges: fuzzy hair, flyaways, intricate outlines, and shapes that are painful to do with a pen tool when you are under time pressure.


The honest truth is that you still refine. You still clean up. You still make production-ready masks. But this approach can cut out a huge amount of tracing time and get you to “good enough to design” fast, which is often the difference between exploring three concepts and only delivering one.
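To make the "green screen" trick concrete, here is a minimal sketch of the roundtrip Matt described. The SAM 2 model itself needs Meta's weights and is out of scope here; this pure-Python toy (tiny RGB tuples instead of a real image, all values invented) just shows the core idea: flood the segmented region with a color that never occurs in real photography, then recover a clean selection later by exact color match, which is what a Color Range selection in Photoshop effectively does.

```python
# Sketch of the "green screen" masking roundtrip, assuming you already
# have a binary segmentation (e.g. exported from the SAM 2 demo).
# Pure Python on tiny RGB tuples; in practice you would use
# numpy/Pillow, or do the selection step directly in Photoshop.

CHROMA = (0, 255, 0)  # a green that rarely occurs in real photos

def burn_chroma(image, mask):
    """Fill every masked pixel with the chroma color."""
    return [
        [CHROMA if mask[y][x] else px for x, px in enumerate(row)]
        for y, row in enumerate(image)
    ]

def recover_selection(image):
    """Rebuild the selection later by exact color match (Color Range style)."""
    return [[px == CHROMA for px in row] for row in image]

photo = [[(12, 40, 33), (200, 180, 90)],
         [(55, 60, 70), (90, 90, 90)]]
mask  = [[True, False],
         [False, True]]

flattened = burn_chroma(photo, mask)
assert recover_selection(flattened) == mask  # the roundtrip is lossless
```

The reason the chroma color matters is the same reason film uses green screens: because it does not occur naturally in the content, the selection is unambiguous, with no cleanup pass needed to separate it from the photo.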


Using AI to interrogate long OEM design guides


Another immediate, practical win Matt shared was using AI as a guide interpreter.


Many OEM design manuals are excellent, but they are long. When you are in the middle of a project, you do not want to hunt through 108 pages to find the one sentence about line thickness, trapping, or coverage.


Matt’s move is to upload the guide into an AI tool and then ask it targeted questions. That turns the manual into a usable reference assistant instead of a document you dread.


I mentioned NotebookLM from Google as another version of this, but the tool is less important than the habit. AI is very good at turning big reference documents into quick answers. That is a real time saver and a real error reducer.
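Since the habit matters more than the tool, here is a deliberately tiny, tool-agnostic stand-in for the "interrogate the guide" idea. Real AI tools do semantic retrieval far better than this; the sketch just ranks paragraphs by keyword overlap to show the shape of the workflow: big manual in, targeted answer out. The guide text below is invented for illustration, not quoted from any OEM manual.

```python
# Tiny stand-in for "interrogating" a long design guide.
# ChatGPT or NotebookLM do semantic retrieval; this sketch only ranks
# paragraphs by keyword overlap with the question. The guide content
# here is invented placeholder text, not a real OEM spec.

def best_passage(guide_text: str, question: str) -> str:
    terms = set(question.lower().split())
    paragraphs = [p.strip() for p in guide_text.split("\n\n") if p.strip()]
    # Score each paragraph by how many question terms it contains.
    return max(paragraphs, key=lambda p: len(terms & set(p.lower().split())))

guide = (
    "Foil coverage above 70 percent may require a second pass.\n\n"
    "Minimum line thickness for raised spot UV is 0.5 pt.\n\n"
    "Trapping should be reviewed for metallic-over-CMYK builds."
)

print(best_passage(guide, "what is the minimum line thickness"))
# prints the spot UV line-thickness paragraph
```

The point of the habit survives even at this toy scale: you ask a targeted question and get back the one passage you need, instead of scanning a hundred pages mid-project.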


The near future: press-aware AI and “negotiation” as the workflow


Toward the end, Matt outlined what he believes is coming next: software that starts by asking how you want a piece to feel, then generates suggestions within real press constraints.


That matters because embellishment design is not universal. Minimum line weights, coverage thresholds, substrate behavior, resolution differences, coating systems, and operator variability all change what is possible and what is safe.


Matt’s vision is AI that is press-aware. You tell it the target device, the substrate, the effect goals, and the constraints. The AI proposes. You negotiate. You refine. You approve.
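The "press-aware" part of that loop is easy to sketch. A minimal version, with every limit below a hypothetical placeholder rather than any real press's spec, is a validator that checks an AI suggestion against the target device before it ever reaches the designer, so the negotiation starts from options that are at least physically safe:

```python
# Sketch of press-aware constraint checking: before a suggestion reaches
# the designer, validate it against the target device's constraints.
# Every limit below is a hypothetical placeholder, not a real press spec.

from dataclasses import dataclass

@dataclass
class PressProfile:
    min_line_weight_pt: float   # thinnest reliable foil/varnish line
    max_coverage_pct: float     # coverage ceiling before offset/curl risk

@dataclass
class Suggestion:
    line_weight_pt: float
    coverage_pct: float

def violations(s: Suggestion, press: PressProfile) -> list[str]:
    """Return the constraint failures a designer would have to negotiate."""
    issues = []
    if s.line_weight_pt < press.min_line_weight_pt:
        issues.append("line weight below press minimum")
    if s.coverage_pct > press.max_coverage_pct:
        issues.append("coverage above safe threshold")
    return issues

press = PressProfile(min_line_weight_pt=0.5, max_coverage_pct=70.0)
print(violations(Suggestion(line_weight_pt=0.3, coverage_pct=85.0), press))
# two violations: the AI re-proposes, the designer re-negotiates
```

In Matt's framing, the AI's job is to keep proposing inside these bounds, and the designer's job is to decide which compliant option actually serves the piece.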


He summed it up in one line: "the future of embellishment design is negotiation with AI."


That is a good way to say it, because it keeps the designer in charge while acknowledging the reality that design is a collaboration between intent and constraints.


Paper-first AI: the next frontier for substrate selection


Jason Leonard from Neenah asked a question that is going to become more common: can AI recommend paper based on the mood, goal, and execution constraints of a design piece?


Matt’s answer was honest. He has used AI to recommend general paper characteristics, like coated vs uncoated or textured vs smooth, but not specific SKUs in a reliable way.


Karlien added the practical reality: paper choice depends on too many variables for AI to guess accurately unless you feed it real specs. Her workflow is to give AI a shortlist with details like weight, finish, coating, sustainability requirements, and the brief, then ask it to rank options with reasoning.


Sabine closed the loop with a reminder that is true in every shop: technically possible does not always mean practically repeatable. Operator skill and process discipline matter. Post-COVID supply chain shifts and material variability add more risk. AI recommendations need to be grounded in real production testing and shop realities.


The takeaway: AI speeds up the work that slows designers down, but it does not replace taste


If you had to boil this session down to one clean conclusion, it is this:


AI is best used to reduce bottlenecks, accelerate exploration, and improve access to technical knowledge. It is not a substitute for taste, judgment, or craft.


The most confident designers in the future will not be the ones who “use AI the most.” They will be the ones who use it strategically, with clear intent, and with production awareness. They will let AI propose, but they will stay in control of the final decisions.


And that is exactly why this conversation matters for embellishment design. We are not talking about generic graphic design. We are talking about a discipline where mistakes are visible, constraints are real, and the difference between good and great often lives inside a mask layer that most people treat like a chore.


If you are willing to treat masks like design layers, use AI to accelerate exploration, and keep your taste as the governing force, you are going to move faster and deliver better work. That is the promise. And for once, it is a promise that feels grounded in reality.


If you have ideas for what you want to tackle next month in the Digital Embellishment Designer Meetup, send them my way. We are going to keep pushing on the topics that help designers create better, more profitable, more repeatable embellishment work in the real world.
