Photoshop: Now the world’s most advanced AI application for creatives
By Pam Clark
Today we release a new version of Photoshop on the desktop and iPad. With it we introduce five major new artificial intelligence features. These new features, added to the already robust Adobe Sensei AI and machine learning features you rely on every day, make Photoshop the world’s most advanced AI application for creatives. This smart use of technology enables you to work faster than ever before so you have way more time to be creative.
Today we ship Neural Filters, Sky Replacement, the new Discover panel, and two new Refine Edge selection improvements, all built on artificial intelligence, in addition to many other great new features you will use every day.
Neural Filters is a major breakthrough in AI-powered creativity and the beginning of a complete reimagination of filters and image manipulation inside Photoshop. This first version ships with a large set of new filters, many of which are still at beta quality. We've decided to ship them to you now so you can try them out, give feedback, and help shape the future of AI in Photoshop. Neural Filters is part of a new machine learning platform that will evolve and improve over time, expanding exponentially on what's possible.
The new Sky Replacement feature intelligently separates the sky from the foreground and allows you to create the ultimate images with beautiful, dynamic skies in just a few clicks.
The Discover panel is loaded with tools and tips to help you work faster. It uses AI to deliver recommendations based on you and your work and includes one-click actions to speed you to results. This is a great new tool inside Photoshop where we will continue to add AI capabilities that enhance your experience with the product.
Object Aware Refine Mode and Refine Hair both use AI to further improve selections that include tricky subjects like hair or complex backgrounds.
You already rely on artificial intelligence features in Photoshop to speed your work every day, like Select Subject, the Object Selection Tool, Content-Aware Fill, the Curvature Pen Tool, many of the font features, and more. Our goal is to systematically replace time-intensive steps with smart, automated technology wherever possible. With the addition of these five major new breakthroughs, you can free yourself from the mundane, non-creative tasks and focus on what matters most: your creativity.
Here’s everything we released in Photoshop today:
Photoshop on Desktop
Neural Filters
https://blog.adobe.com/media_ecda73f0abbcd2a639a365de35df447182e8ebfa.mp4
Neural Filters is a new workspace inside Photoshop that introduces non-destructive filters to help you explore creative ideas in seconds. In it we have taken many things Photoshop does well and improved them, using machine learning to reduce complex workflows to one click or a couple of sliders. You can explore a rapidly growing and improving library of artistic and restorative filters, find a "best" idea for inspiration, and refine your image with the familiar tools you know in Photoshop. The filters in this workspace will speed parts of your workflow, but they might not produce the result you need every time. You can try more creative (and labour-intensive) ideas than ever in the time you booked for your project, and you have all the power of Photoshop when you take your Neural Filters result back to the canvas for final touches.
Change the depth and warmth of just your background with two sliders
We are introducing the first Neural Filters to demonstrate the power of the technology at work in popular uses of Photoshop. Skin Smoothing and Style Transfer, our first 'featured' filters, can help photo retouchers, inspire artists, and produce some of the most consistent results at this point in development. We have also released six 'beta' filters, along with new feedback tools for customers to tell us how well the results satisfied their creative intent. We want you to try out these featured and beta filters on your images and give feedback to improve Photoshop. Beta filters might work really well on certain types of images, but not as well on others yet. This is only the beginning, and these filters will improve with your input.
Smart Portrait is one of these first beta filters and helps you transform age, expression, pose, colours, and more. Artificial intelligence analyses the content of your portrait and gives you the ability to change aspects like the facial features in your image. You can use the gaze and head sliders to change the direction of the eyes or head, or the light direction slider to change the angle of the light source. Change hair thickness or the intensity of a smile, add surprise or anger, or make someone older or younger. The current filter works best on subtle changes, but you can crank it up and let your imagination run wild.
In the image above, I turned the pirate's head and eyes to the right with the head and gaze sliders and pumped up the anger slider a bit, which produced a subtle snarl.
The images above show the light direction slider at work. Look at her cheek, forehead, and chin. In the middle image I moved the light source to the left; in the far right image I moved the light direction slider to the right. I might want to reduce the yellow cast on her left cheek and a few other artifacts by brushing on the non-destructive mask that Photoshop produces when you use Neural Filters. Meredith shows how simple it is in her sneak video at the top of this section. The key here is that I tried out two creative ideas with one slider in seconds, jumping ahead in my workflow. That saves me time to get to the end result I need for my project, while letting me retain complete creative control.
I shifted the light direction slider to the right. Finishing touches are easily completed in Photoshop
There are multiple beta filters to help you do very practical things in your photo adjustment and retouching workflows. Boost the resolution of smaller selections within portraits with Super Zoom, or remove the blocky artifacts that result from JPEG compression with JPEG Artifacts Removal. Depth-Aware Haze simulates volumetric haze in your background to better highlight your subject.
Here is a before (left) / after (right) Style Transfer example:
Below is a before and after of Colorize where the artist started with a black and white image and the smart technology added content-aware colour in one click.
Colorized a black and white image with content-aware AI technology
Below is a before and after of JPEG Artifacts Removal (note the artifacts in the clouds):
Filters can be applied non-destructively as Smart Filters, applied directly to a layer, or output as a new layer containing the changed pixels. Not every filter will produce perfect results on every image, so you can use all the tools of Photoshop to get the exact look you want by masking out any machine learning imperfections.
I hope you enjoy exploring all the new capabilities. We want to help you bring your vision to life and stand out as a creator, and we know the labour and time pressure of trying to identify that 'best' idea within the time afforded for your project. We built Neural Filters to help you get closer to your final results faster and try out as many ideas as you can before you take the result back to the canvas for final refinements.
Many thanks to NVIDIA for their collaboration and partnership on Neural Filters. We released this new feature to all Photoshop 22.0 customers on all devices, but the performance is particularly fast on desktops and notebooks with graphics acceleration.
I look forward to hearing what you think. Share your work on social using the #neuralfilters tag to help us see your results and give back the likes.
For more information about Neural Filters go here.
Sky Replacement
Starting today, it is faster and easier than ever before to create more dynamic images by swapping in a new sky. Photoshop now knows what's foreground and what's sky. You can either select the sky yourself with Select > Sky and edit it to your heart's content, or use Edit > Sky Replacement, choose a new sky from our included presets or add your own, and let the new Sensei-powered machine learning models do the masking and blending. We use cutting-edge algorithms to harmonize the foreground of your image with the sky, so if you change a bright afternoon sky to a sunset, the entire image takes on the warmth of the golden hour.
You can zoom in and select just a section of sky, or move the sky around to find the right configuration of clouds or colour (or planets) you want.
We're shipping with about 25 sky presets provided by our imaging experts, including Russell Brown and Julieanne Kost. Or you can use your own skies, gently enhancing what should have been or radically changing it to the fantastical, with tons of precision and control in just a few clicks.
Sky Replacement is a huge time saver for our customers, especially those retouching landscape, real estate, wedding, and portrait photography. The sky's the limit!
Go here for more information about Sky Replacement.
Intelligent Refine Edge
Photoshop has imaging scientists who have invested whole careers in making selections incredible and virtually one-click easy for you. They have leapt ahead over the past few releases with Sensei artificial intelligence powering features like Select Subject and Object Selection, adding multiple algorithms that smartly deal with tricky hair and complex backgrounds so you don't have to.
Today there are two new Sensei features in the Select and Mask workspace: Refine Hair and Object Aware Refine Mode.
Refine Hair: This convenient little button packs a Sensei punch! It’s located in the Options bar across the top of the Select and Mask workspace. It seeks out the people in your selection and automagically refines the selection of their hair. It’s as if you had grabbed the Refine Edge brush and done the strokes yourself. This is especially useful if you have used the Object Selection Tool or Quick Select Tool to select a particular person, and want to refine the hair in a single click.
Object Aware Refine Mode: It's always been difficult to precisely select hair and other fine elements of an image, particularly when the foreground and background are similar in colour or otherwise hard to differentiate, as in the image above. Now you can click the Object Aware button to set the Refine Edge mode and make those selections even better, even faster. To demonstrate, I chose a lion with a mane that blends into the background savanna and sky. Using Select Subject and just a few strokes of the Object Aware Refine Edge brush, you can see the precision I was able to achieve in just a few seconds. The Object Aware algorithm has been trained to understand objects in the scene and thus works better with similarly coloured or similarly textured backgrounds.
New Discover panel with Better Search, Help and Contextual Actions
The new Discover panel brings an entirely new learn and search experience to Photoshop. It combines a massively expanded library of in-app learn content, brand-new step-by-step tutorials, and powerful new search functionality.
Artificial intelligence makes the new experience context-aware and provides you with recommendations based on you and your work. These recommendations include tips and tutorials on how to get multi-step workflows done faster, and we have also packaged top workflows into automated one-click Quick Actions that help you remove and blur backgrounds, make a black and white background, or enhance an image.
You can access the Discover panel at any time by clicking the magnifying glass icon in the upper right of the app frame, by pressing Cmd-F, or via the Photoshop Help menu.
You can also use the dialog to find quick links out to Adobe Stock images, the support community, the Adobe Fonts page and more. There’s a lot of great learning and time-savers packed into this new panel.
Pattern Preview
Patterns created in Photoshop often end up on the clothes we wear, in the games we play, and in a huge variety of other outputs.
Creating a perfectly repeating pattern can be tricky, but now it's easier than ever. Pattern Preview is a document view mode that lets you see how your document will appear as a pattern: in this view, we virtually tile and repeat your document. You can move and adjust layers in this mode as well, so you don't have to guess how the final repeat will look.
Voila! With all the time saved, now you can get around to wallpapering that bathroom with your own design.
Live Shapes
Creating and adjusting shapes should be simple. We have made dramatic improvements to shapes and the shape tools for this release. First, you’ll notice a new tool within shapes to create triangles. Second, we’ve added on-canvas controls to make resizing and adjusting shapes intuitive and fast. Try creating a rectangle, triangle or polygon to see the new on-canvas controls. Third, we have made shapes easier to adjust after they’ve been created by adding new controls to the properties panel.
Beyond these listed changes, we've made improvements to the Line tool (a change that is certain to be a sleeper hit of this release), the Polygon tool, and a whole lot more.
Reset Smart Objects
Over the past several releases we’ve made continuous improvements to the Properties Panel. Our goal is for the Properties Panel to become an indispensable component that enables workflow velocity by surfacing the most common actions and tasks in context.
Now we've added another enhancement to the Properties Panel that allows you to completely reset a Smart Object to its original state. Any rotations, transforms, or warps are detected, and a 'Reset' button restores the original.
Faster, Easier to Use Plugins
One of the hallmarks of Photoshop has always been the thousands of plugins created by our community of developers, partners and creatives. These plugins extend Photoshop with an endless diversity of superpowers. With this release, we’re introducing major enhancements to Photoshop plugins.
We're making plugins easier to discover, manage, and use. You can now find plugins in the new plugins marketplace in the Creative Cloud desktop app, where you'll see curated collections like our Editor's Choice alongside all the plugins and integrations we offer. And in Photoshop, your plugins will be right at your fingertips through a new plugin launchpad.
We're also bringing UXP, our new modern extensibility platform for building plugins, to Photoshop. The new marketplace and the new plugin architecture make plugins easier to find, improve their reliability and performance, and make installation a breeze. They also provide a more uniform user experience that integrates better with Photoshop.
We are shipping with a hand-selected group of popular Photoshop plugins built on UXP that are now available in the marketplace across several categories. For example:
- Asset management and stock: Access your work with the Dropbox Transfer plugin, or find 3D assets from Pixelsquid.
- Collaboration: Stay connected to your team and projects with plugins that connect Photoshop to key services like Dropbox, monday.com, Trello, Slack and Xero.
- Image editing: Empower your image retouching workflow with plugins from Pro Add-Ons, Picture Instruments, Tony Kuyper, Greg Benz, and Davide Barranca.
- Utility and other exciting plugins: Work faster by connecting to hardware consoles like Loupedeck, predict what users will see at first glance with AI from 3M VAS, or capture AR objects to edit right in Photoshop with ClipDrop.
More plugins are on the way, including a connection to Google Drive and many other popular experiences that complement your work with Photoshop.
UXP is now publicly available in Photoshop and XD, making it easier to build a plugin that works with multiple apps, and we're working hard to bring it to the rest of Creative Cloud. Any developer who wants to build a new plugin for Photoshop, or migrate from older APIs to the new platform, can do so. You can find more information here and on the Adobe I/O developer website.
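For developers curious about what building on UXP looks like in practice, here is a minimal sketch of a panel plugin's entry script. It is an illustration based on the general shape of the publicly documented UXP Photoshop API, not code from this release: the file name, button id, and helper function are hypothetical, and details such as how the layers collection iterates may differ between API versions.

```typescript
// index.ts (hypothetical name): entry script for a minimal UXP panel plugin.
// UXP plugins load the host app's scripting surface with require(); the
// "photoshop" module and its `app` object come from the UXP Photoshop API.
const { app } = require("photoshop");

// Collect the names of the top-level layers in the active document.
// Assumes the layers collection is iterable; treat this as a sketch, not a spec.
function listLayerNames(): string[] {
  const doc = app.activeDocument;
  if (!doc) {
    return []; // no document open
  }
  const names: string[] = [];
  for (const layer of doc.layers) {
    names.push(layer.name);
  }
  return names;
}

// Wire a button defined in the plugin's panel markup (hypothetical id) to the helper.
document.getElementById("list-layers")?.addEventListener("click", () => {
  console.log(`Layers: ${listLayerNames().join(", ")}`);
});
```

A UXP plugin also ships a small JSON manifest that declares its panels, commands, and permissions; the new marketplace and launchpad then handle installing and surfacing the plugin for users, so developers can focus on the script and panel UI.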
Learn more about the exciting future of plugins for all Creative Cloud apps from Vijay Vachani, head of extensibility for Creative Cloud, here.
Cloud Document Version History
Versions are automatically created as you work on a cloud document, which is a huge benefit if you need to look back at or revert to prior states. Now, inside Photoshop, you can view, revert to, open, save as, and name past versions of a cloud document through version history, as I did above while editing different variations of a scene.
Cloud documents are now available offline. We have also improved the sync experience, so you can view sync status, the amount of data being used for a sync, and how much of your storage quota you have used. For now, this is available only on desktop and will ship in the iPad version in a later release.
Now you can place embedded Photoshop cloud documents (PSDC) in Illustrator cloud document files (AIC). You can already place embedded Photoshop cloud documents inside Photoshop cloud documents.
Preset Search
Over the last few years we’ve taken major strides to help you work more intuitively with presets. Now we’ve made an enhancement to help you quickly find a specific preset: Search!
Search is now included in the Brushes, Swatches, Gradients, Styles, Patterns and Shapes panels so you can spend less time hunting for a specific preset and more time creating.
Go here for everything new in Photoshop this release.
Photoshop on iPad
Get Photoshop on iPad here
Edit Image Size
Change the dimensions, resolution, and sampling of your PSD to fit the output that you need. This new capability matches what you can already do in Photoshop on the desktop.
We have added a document properties panel, accessible via the gear icon in the top left, which shows document dimensions, colour mode, and resolution. We have also brought over the following image size options from Photoshop on the desktop:
- Image resampling (with the same resampling modes as the desktop: bicubic, preserve details, and so on)
- A preview of the resampled image (pan and zoom for the preview will be added in an update after MAX)
For more information go here.
Live Streaming
Now gallery and live streaming options are available within Photoshop. Anyone can livestream from the app or watch past recordings. Here’s how you do it:
- Start a livestream from the “Export” menu on the top right
- Create a livestream of your work and use your iPad's camera and the chat window to interact with viewers on Behance as they watch you work live. Toggle camera and chat on and off to match your live streaming needs.
- Live streams will be sent directly to Behance to be viewed live on the web, and recordings will be moderated and posted in the gallery on Behance and inside the Photoshop on iPad app
There’s also a new Behance Gallery inside the app so you can get inspired and view work from others in the community. In it you can:
- View work on Behance that has been created specifically with Photoshop on iPad by users like you
- View live streams of artists on Adobe Live, and from the new live streaming feature (both live and recorded)
- Browse through all Behance projects created on Photoshop on iPad and curated by Behance staff – get inspired, see and comment on others’ work
- Share your own work on Behance.net to see it show up in the gallery as well (the ability to share directly to Behance from within the app is coming soon after MAX)
Go here for more information about live streaming and the gallery.
Go here for a detailed summary of what’s new on mobile this release. Go here for everything new in Photoshop this release.