
What should you know as a designer?

Updated: Mar 8, 2023



When it comes to graphic design, there are a lot of different things that go into it.


By the time you finish reading this, your head might be in a completely different universe.


There isn't necessarily one pillar of where to start with graphic design or even motion graphics. You can start anywhere in your journey of understanding; you just have to later figure out how the different pieces fit together in the correct order. Most people typically start with Photoshop and branch out from there.


Adobe has an intricate ecosystem of graphic design software where the different file types can be read by their respective sibling applications. That means you can start in Adobe Photoshop, create a bunch of layers, then hop into Adobe After Effects, import the .psd from Photoshop, and turn those individual Photoshop layers into elements that can be animated. That's just one example for reference, and it goes much deeper into how the other programs interoperate.


Resources


You will also find more resources attached to each specific topic.


 

Jump List Topics

 

Displaying Color


Your computer monitor, your mobile phone, your laptop, and your TV all have different properties that determine what you're looking at and whether the color is represented clearly and accurately.


Dealing with color alone is its own rabbit hole. I have made a specific page on my website called Color Science, and even that covers just the highlights of the beginnings of the absolute craziness that goes into how radically different colors can be.


One great place to start for picking colors for a particular project is Adobe Color. It is a free, beautifully designed resource for picking a color and for exploring palettes other people have put together where the colors are harmonious with each other.
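As a small taste of why displaying color is its own rabbit hole: sRGB pixel values are not stored as linear light. Below is a sketch of the standard sRGB transfer function in Python, just to show that a "50% gray" pixel is nowhere near 50% physical brightness.

```python
def srgb_to_linear(c: float) -> float:
    """Convert an sRGB-encoded channel value in [0, 1] to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """Convert linear light back to the sRGB-encoded value."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055

# Mid-gray (sRGB 0.5) is only about 21% linear light intensity.
mid = srgb_to_linear(0.5)
print(round(mid, 3))                    # ~0.214
print(round(linear_to_srgb(mid), 3))    # round-trips back to 0.5
```

This is exactly the kind of non-obvious math that sits underneath every monitor, phone, and TV you design on.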


When you print something out, always remember: there is no such thing as white ink.

Printers use subtractive color (CMYK inks on white paper), which means anything white in your design is simply left unprinted — the white comes from whatever it is that you are printing on.



If you are doing an all-over print for a garment, that technique is usually sublimation, which means the garment being printed on is white. If you want to print black, you are combining all the ink colors together to produce black. Sometimes, after manufacturing or after washing the garment once or twice, the mix of colors that makes up that black will drift toward a dark, dark blue tint and won't look as authentic as it did before washing.
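To make the "no white ink" idea concrete, here is a naive RGB-to-CMYK conversion sketch in Python. Real print workflows go through ICC profiles, so treat this as back-of-the-napkin math only: white comes out as zero ink on every plate, and black maxes out the K plate.

```python
def rgb_to_cmyk(r: int, g: int, b: int) -> tuple:
    """Naive RGB (0-255) to CMYK (0.0-1.0) conversion, ignoring ICC profiles."""
    rf, gf, bf = r / 255, g / 255, b / 255
    k = 1 - max(rf, gf, bf)          # black plate: how dark the pixel is
    if k == 1:                       # pure black: only the K plate prints
        return (0.0, 0.0, 0.0, 1.0)
    c = (1 - rf - k) / (1 - k)
    m = (1 - gf - k) / (1 - k)
    y = (1 - bf - k) / (1 - k)
    return (c, m, y, k)

print(rgb_to_cmyk(255, 255, 255))    # white = no ink at all: (0.0, 0.0, 0.0, 0.0)
print(rgb_to_cmyk(0, 0, 0))          # black = full K plate: (0.0, 0.0, 0.0, 1.0)
```

Notice the white result: every plate is zero, because the printer never lays down white — it relies on the paper or garment underneath.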


When it comes to viewing media on a TV, color accuracy depends on many different things — starting with the panel you're viewing it on:


  • Is it a traditional OLED panel?

  • Is it using Quantum Dot OLED technology?

  • Is it an LED screen with IPS?


There are a lot of different factors that play into whether the color is represented as accurately as the film or media was intended.


Was the media encoded at a high bit rate for a crisp, clean, sharp image, or was it encoded at a lower bit rate where heavy compression skews the color accuracy of the media?



Keyframe Animation


Keyframe animation is actually pretty easy to accomplish. It harkens back to what used to be frame-by-frame animation in old-style cartoons. With the computerization of technology, frame-by-frame animation is a thing of the past, and it no longer takes a large amount of time to accomplish animation with whatever complexity you want to achieve.


With keyframes, you no longer have to go frame by frame to describe how something should be animated. All you have to do is tell the computer two or more positions or property values for an object or image asset, and the computer does the rest of the work: it creates a smooth animated transition between the first state, the last state, and any states in between that you defined.


A simple example of keyframed animation could be taking a 3D car model, exploding every individual part of the vehicle apart, and telling the computer where each part starts and ends. Then all you have to do is press play, and the work is done — it's wildly impressive how simple that can be in modern times.


You can get even more advanced with your keyframes and play around with the F-Curves of the animation the computer has generated, to speed it up or otherwise control the speed at which the animation plays.
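The whole idea can be sketched in a few lines of code: you supply the keyframes, and the computer interpolates every in-between frame. This is a minimal Python sketch, with a smoothstep function standing in for a real F-curve.

```python
def ease_in_out(t: float) -> float:
    """Smoothstep easing: a simple stand-in for an F-curve."""
    return t * t * (3 - 2 * t)

def keyframe_value(keys, time, ease=None):
    """Interpolate a property between (time, value) keyframes."""
    keys = sorted(keys)
    if time <= keys[0][0]:
        return keys[0][1]
    if time >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= time <= t1:
            t = (time - t0) / (t1 - t0)   # normalized 0..1 between the two keys
            if ease:
                t = ease(t)               # bend the timing, like an F-curve
            return v0 + (v1 - v0) * t

# Two keyframes: position 0.0 at frame 0, position 100.0 at frame 30.
keys = [(0, 0.0), (30, 100.0)]
print(keyframe_value(keys, 15))               # linear in-between: 50.0
print(keyframe_value(keys, 6, ease_in_out))   # eased: starts slower than linear
```

Real software like After Effects or Cinema 4D is doing this same interpolation for every property you keyframe, just with fancier curve controls.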



MP4 - AVI - MOV - MKV


These are four popular video containers (wrappers) for holding encoded video data, which can also include audio.


I can break it down pretty simply here.


  • MP4 – great flexibility for streaming video, with good compression and easy use online at an optimal bit rate that isn't too high.

  • AVI – an older container often paired with lightly compressed codecs, so file sizes can be large. Can carry an alpha channel with the right codec.

  • MKV – moderately large file sizes, not as large as uncompressed AVI, and popular for high-bitrate rips of Blu-ray discs.

  • MOV – common with Apple devices, has alpha channel support, is usually similar in size to an MP4, and historically required QuickTime on Windows for some codecs.


If you're also interested in what ProRes offers, it is a codec created by Apple themselves for professional video use. It has alpha channel support, though I haven't personally explored it much since I'm not familiar with it.


I absolutely like when a file format standard has alpha channel support. When I'm making my 300-layer music videos in Adobe After Effects, I like being able to export my pre-compositions so that caching frames for previews stays sustainable and I don't have to wait five minutes for After Effects to compute 300 layers.



Linear vs Nonlinear Editing


Linear Editing

  • Think of a Single Video Track

  • Single Audio Track

  • Limited Effects Layering you can stack

Nonlinear Editing

  • As many Video/ Audio/ Adjustment Layers that you want (that the program can manage)

  • Tons of custom effects

  • Layering all different types of footage


When it comes to linear versus nonlinear video editing, the ultimate tool is nonlinear editing software, so that you are not confined to a single video track and a single audio track.


If you are comfortable using a linear editor, you can still have the same experience in nonlinear video editing software — nonlinear editing just gives you far more benefits than linear editing does.



JPG vs. PNG


JPEGs are a lossy, compressed image format. PNGs are lossless, which means the image data is compressed without throwing any of it away.


Converting a JPG into a PNG doesn't make the image any clearer or higher quality. It just preserves the photo as it currently is.


Converting a PNG into a JPG compresses and degrades the quality of the image and removes the additional benefit of alpha transparency.


The more you keep overwriting and re-saving a JPEG, the more artifacting grows around certain edges as a byproduct of the compression. These are called compression artifacts, and they look like blocky multicolored squares.


When editing a PNG you can save and overwrite the file as many times as you like while working with it, and it will retain the same clarity — it stays lossless and never grows compression artifacts the way JPEGs do.
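The lossless-versus-lossy distinction can be shown with a toy example. Python's built-in zlib uses DEFLATE, the same compression family PNG uses, and its round trip returns every byte exactly; the quantization step below is a stand-in for the kind of irreversible value-rounding that JPEG-style lossy compression performs.

```python
import zlib

# Fake "pixel" data: one byte per pixel of a tiny grayscale image.
pixels = bytes(range(256)) * 16

# Lossless compression (the spirit of PNG's DEFLATE stage):
# the round trip gives back every byte exactly.
compressed = zlib.compress(pixels)
assert zlib.decompress(compressed) == pixels

# Lossy compression (the spirit of JPEG): quantize values to coarse steps.
# Smaller to store, but the original byte values are gone for good.
quantized = bytes((p // 16) * 16 for p in pixels)
assert quantized != pixels

print(len(pixels), len(compressed))   # lossless still shrinks the data
```

This is why re-saving a PNG is safe while re-saving a JPEG slowly eats your image: the lossless path is perfectly reversible, the lossy path is not.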



PDF Illustrator SVG Graphics


When it comes to the world of vector graphics, there's a whole different characteristic that sets it apart from JPG/PNG bitmap graphics. Bitmap graphics are pixel-based, and when you scale the image up, the photo gets blurry because the pixels are not recalculated. Vector graphics, on the other hand, can scale infinitely with zero loss of quality.
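Here is a toy sketch of that difference in Python. Scaling a raster just repeats the existing pixels (nearest-neighbor), while "scaling" a vector means re-evaluating the shape's math at the new size, so nothing gets blocky.

```python
def sample_gradient(width):
    """A 'vector' gradient: brightness recomputed from its formula at any size."""
    return [round(255 * x / (width - 1)) for x in range(width)]

def upscale_raster(row, factor):
    """Raster scale-up: nearest-neighbor just repeats the pixels it already has."""
    return [p for p in row for _ in range(factor)]

small = sample_gradient(4)          # [0, 85, 170, 255]
print(upscale_raster(small, 2))     # stair-steps: [0, 0, 85, 85, 170, 170, 255, 255]
print(sample_gradient(8))           # smooth: [0, 36, 73, 109, 146, 182, 219, 255]
```

The raster upscale can only repeat what it has, producing stair-steps; the vector version recomputes fresh values, which is exactly why Illustrator artwork stays crisp at any zoom.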


With Adobe Illustrator you can open PDFs and get every individual element as its own respective layer. Talk about how cool that is. Not even Photoshop can achieve that level of detail — Photoshop will just flatten the layers and elements.


PDF's can also store Photoshop layers as well.



Black Video to Alpha Transparency


When you're using After Effects, there are two easy ways to use footage that has a pure black background.


Color Blending Modes - Learn about all the different color blending modes there are.


There are only two options that will remove black from the piece of footage you're using:

  • Screen/ Add blending modes.

  • Unmult – a free plugin that removes all elements and shades of black and turns them into alpha for even better editing ability.
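Mathematically, both tricks are simple. Screen blending maps black in the top layer to "no change," and an unmult-style pass derives alpha from brightness. The Python below is a rough sketch of that math on normalized 0.0–1.0 channel values, not the actual plugin's implementation.

```python
def screen(top: float, bottom: float) -> float:
    """Screen blend: black (0.0) in the top layer leaves the bottom unchanged."""
    return 1 - (1 - top) * (1 - bottom)

def unmult(r: float, g: float, b: float) -> tuple:
    """Rough unmult-style pass: derive alpha from the brightest channel,
    so pure black becomes fully transparent."""
    a = max(r, g, b)
    if a == 0:
        return (0.0, 0.0, 0.0, 0.0)   # pure black -> fully transparent
    return (r / a, g / a, b / a, a)   # un-premultiply the color by alpha

print(screen(0.0, 0.7))        # black pixel lets the 0.7 background through
print(unmult(0.0, 0.0, 0.0))   # (0.0, 0.0, 0.0, 0.0): gone entirely
print(unmult(0.5, 0.25, 0.0))  # color un-premultiplied, alpha 0.5
```

That max-channel-to-alpha step is why Unmult handles dark *shades* gracefully instead of only keying out pure black the way Screen/Add do.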



Mac OS Vs. Windows


When it comes down to it you can use Mac OS and have a great experience while designing for your clients.


OSX

  • No support for Nvidia CUDA hardware acceleration

  • Newer ARM M1/M2 configurations are limited to a set amount of memory for the lifetime of the system

  • Locked into the Apple ecosystem, which leaves optimization up to Apple or to each developer's ability to make use of the configuration you're running

  • Final Cut Pro X/Motion — exclusive software, Apple only


Microsoft Windows

  • Nvidia CUDA hardware acceleration

  • Older SLI/Crossfire multi-GPU support (dependent on your hardware)

  • Better codec support with MP4 hardware acceleration

  • High ability to mod the operating system

  • Memory configuration changeable at any time during the lifetime of the system

  • No ecosystem locking you into one particular vendor, and you can modify how Windows operates if something doesn't work correctly


Even though Apple has developed the Mac Studio with their newer ARM processors, the difficult pill to swallow is what happens in two or more years when Apple comes out with a newer edition of the hardware. Apple does make migrating to new hardware seamless, but what no one ever wants to experience is this: once your Apple computer is over 10 years old, your operating system is tied to the hardware you're running, which lets Apple make a simple software check that says, "Oh, you want to install our new operating system to take advantage of new software and hardware improvements? You'll need to fork over $4,500" — making your old system obsolete even though it could still run that new operating system update just fine. They choose to force that illusion to keep pumping up that record-high company valuation of 2 trillion-plus dollars.


You can make an old late-2010, $3,000 iMac run a newer operating system just by replacing the internal hard drive with a solid-state drive (SSD) and using a modded OSX Catalina install. Then you're off to the races using an old iMac in a modern Apple environment. There's no reason to deem old hardware e-waste just because a 2-trillion-dollar company says so — it's such a waste.


Ever since 2018, Apple has been down this path of non-replaceable, non-upgradable parts. Their newest ARM-architecture processors are fascinating, but they come with a huge downside. Say you're using a Mac Studio and your internal drives wear out completely: they are very difficult to replace. Apple designed the Mac Studio so that if you mess with the internal drives — say you bought a 1-terabyte configuration and later had the money to swap in two modules going all the way up to 8 terabytes — you're stuck. Apple has serialized the drive components and paired them with the processor you configured the computer with, since the storage controller is embedded in the custom ARM processor they created. Your only option is to cry to Apple for mercy to re-pair the drives to your processor — and that isn't even a service Apple offers. They will leave you helpless, say you voided your warranty because Apple didn't expect users to want to upgrade the storage in their Mac Studio later, and tell you to resort to external storage to expand your product.


When it comes to Microsoft Windows, you have full control over how your hardware operates, particularly if you're building your own custom desktop workstation rather than buying a premade laptop. Along with having control over your hardware, it's very nice to be able to upgrade your dedicated GPU whenever a new one is released by whichever company you favor, whether that's Nvidia or AMD.



Nvidia Vs. AMD


When it comes to graphic design, Nvidia takes the crown a lot of the time because of how dominant Nvidia has been in the dedicated Quadro and server markets. Both GeForce GTX/RTX and AMD video cards are perfectly capable of running video games — that's a task they don't have an issue with. But when it comes to graphic design, real 3D work, and complicated rendering, you're better off sticking with Nvidia.


As far as I know, AMD has very little involvement in real-time 3D path-traced rendering. Take Octane Render by Otoy: for a long time they specifically chose to focus on Nvidia CUDA acceleration when developing their rendering plugin for all sorts of 3D software — Maya, Cinema 4D, Blender, etc. If you want to use Octane Render on Apple hardware, you're largely out of luck; it ends up effectively being a Microsoft Windows exclusive.


Apple isn't all that interested in creating dedicated external GPUs for their products; their focus is on ARM CPU development. Apple did create an accelerator card for one of their Mac Pro towers, but its only use is ProRes acceleration. Apple built it for their own ProRes codec, so it has one very specific use case and is not designed for anything outside of decoding or encoding ProRes footage.



3D Software fundamentals


When it comes to working with 3D software, a lot of the core functionality is very similar across programs. The user interfaces can be mildly different between packages, but there is plenty you can do in any of them.


My software of choice is Maxon Cinema 4D, because the way the interface is laid out just makes sense when you're using the software for the first time.


There are other applications that don't cost money, such as Blender. Blender is a jack of all trades for everything 3D: VFX, color correction, compositing, 3D sculpting, a game engine, video editing. Blender can do a lot, and it does a lot of it very well. Personally, though, I feel the learning curve for Blender is very steep compared to Maxon Cinema 4D, and that it doesn't take a focused approach of doing one thing very, very well.


Given the complexity of Blender, I felt comfortable with Maxon Cinema 4D much faster than I did when attempting to use Blender for something very simple.


If you want to get into 3D software and use it effectively, I can definitely say that with Maxon Cinema 4D you're going to have a much better time than banging your head against the wall with how ridiculous Blender's interface and shortcuts are.


Maxon Cinema 4D's .c4d file format is proprietary, but there's a way around dealing with that. If you're trying to get into Cinema 4D while in the middle of using another 3D program, the solution is the FBX or Alembic file format.


  • FBX was created by Autodesk and is primarily used for exporting animations, especially into video game engines. It supports geometry meshes, materials, textures, and animation.

  • FBX is great for preserving rigged character meshes and weights.

  • Alembic files let you export animations, materials, multiple geometries, cameras, and lights.


Alembic can be the better successor to FBX.


Webp Image codec


WebP is an image codec made by Google. This newer codec achieves 25% to 34% smaller file sizes compared to its JPG equivalent. Photoshop hasn't always opened WebP images natively, but it only needs a small plugin to be able to read the format.



HEIC Vs. Jpg


HEIC is an image format based on HEIF, a standard developed by the MPEG group; Apple adopted it across its devices in 2017.


HEIF stands for High Efficiency Image File Format; HEIC is that format with images compressed using the HEVC codec.


  • Half the file size of JPG

  • Better Image quality

  • HDR (High Dynamic Range) support

  • 16-bit color depth

  • 3D depth data from the camera sensor

  • Support for transparency/alpha channel

  • Non-destructive edits

  • Apple Live Photos (3-second image sequences)


JPG/JPEG was standardized in 1992 by a joint committee formed in 1986, with contributions from companies including IBM, AT&T, Canon, and Mitsubishi Electric.


JPG/JPEG stands for Joint Photographic Experts Group.


  • No HDR Support

  • No Transparency/ Alpha Channel

  • Compressed Images

  • 8-bit color depth

  • Only destructive editing


As you can see, HEIC has huge benefits as a more modern image standard. The single downside to HEIC is the lack of legacy support on devices that weren't built for it. Apple devices support HEIC natively, whereas Windows, Android, and other devices may not read the HEIC format out of the box. If you want HEIC to work on such a device, you have to add support by installing the codec for it.


JPEG is also not well suited to files that will undergo multiple edits, as some image quality is lost each time the image is recompressed, particularly if the image is cropped or shifted, or if encoding parameters are changed – see digital generation loss for details. To prevent image information loss during sequential and repetitive editing, the first edit can be saved in a lossless format, subsequently edited in that format, then finally published as JPEG for distribution.


Layer Editing Vs. Node Based Editing


Adobe After Effects is a layer-based compositor. Nuke X by The Foundry is a node-based compositor (Blackmagic's Fusion is another node-based option).


There are pros and cons to layers versus nodes. A lot of Hollywood production companies use node-based compositing.

Adobe After Effects does a lot of things very well, but over the years there has been a lot of stagnation in its development. After Effects was originally built around single-core performance, where every single frame hammers a single CPU core for previews and rendering. Only with the release of After Effects 2022 has there been development for multi-core, multi-frame rendering and exporting — which means every single plugin now has to be updated to support it.

Adobe has really dragged their feet on developing their creative suite. Back in the day, you could outright own your creative programs with a license that would not expire. With the drive for greed and the push for subscriptions, the ability to own your software has been taken away. This means that if Adobe's license servers go offline — if Adobe ever dies — then all the subscriptions people are paying for become useless and the software becomes a paperweight. When you make your development dependent on people's subscriptions, the company no longer has to worry about pushing people to buy next year's new version; it can just be on cruise control and go six months without updating its software.

Node-based compositing has loads of benefits you can't get in After Effects. With nodes, you can link to elements outside a composition that are dynamically updated in the original comp they're referenced in. After Effects is unable to achieve such a feat. You can nest pre-comps inside your main comp in After Effects, but if you want to update a pre-comp, you have to manually go into the comp nested inside it and change it to see it update.



Desktop Workstation Vs. Laptop Editing


When it comes to using a desktop workstation versus a laptop for editing — trying to do the same stuff on a mobile form factor that you'd do on your desktop — many years ago that wasn't nearly as possible as it is today.


  • Desktop CPUs generally get the full-fat silicon blueprint for the highest performance possible for that architecture and generation of processor.

  • Laptop processors are generally a cut-down, simplified version of their desktop counterparts.


With modern advancements, we now have laptops that are more on par with desktop components. Some gaming laptops, in limited production runs, have even shipped with an actual desktop processor in a thick chassis with big cooling to keep it under control while still being mobile.


When I want to use my desktop on the go and have my laptop with me, I use AnyDesk to remote into my desktop for anything I might want to do, while my desktop keeps working away as I move around.


Ultrabooks are a high-end option capable of complex, heavy editing at times, but for long stretches of heavy work they're not great — they'll still get it done, just while producing a lot of heat.


There isn't anything wrong with using a gaming laptop as a content creation machine, but a lot of the time gaming laptops aren't designed for it:


  • Color accuracy on their displays is often terrible

  • They don't last long on battery, often only achieving an hour on one charge

  • Webcam quality tends to be terrible and muddy

  • Speakers are decently loud but tuned for gaming, not ideal for movies or watching videos

  • The touchpad can be terrible to use without an external mouse, and is usually not as feature-rich as other laptops' touchpads, which offer lots of gestures for navigating around the operating system


Those are some of the typical downsides that affect using a gaming laptop as a content creation device. Expect to always need to be near a plug wherever you take it, and bring headphones so you can use it in private without the speakers disturbing others.


Building a great desktop workstation is ideal for anybody who doesn't want to wait 10 minutes for a video to export from their favorite nonlinear video editing software.


Ideally, you'll understand your desktop workstation better if you build it from scratch. When I was picking the parts for my desktop, I cared about its future upgradeability.


  • Consumer laptops normally max out at 32 GB of memory

  • Consumer-grade desktops max out at 64 GB of memory

  • Prosumer-grade parts max out at 128 GB of memory (Intel X99 chipset)

  • AMD Threadripper supports 256 GB of memory or more

  • Servers can be configured with up to 2 terabytes of memory, depending on their configuration and use


When it comes to memory, ultimately the more the merrier. I still have an X99 system to this day — it's 7 years old and still runs like new, absolutely fantastically. It supports multiple GPUs, which is great for linearly scaling GPU-rendering workloads such as Octane Render by Otoy. X99 lets you run two GPUs, each with full-fat x16 bandwidth, in your PCIe slots.


The modern alternative for lots of PCIe bandwidth comes down to building a whole new system around AMD Threadripper. Intel decided the industry should focus on single-GPU configurations: industries such as game development spent a lot of time dealing with users' multi-GPU configuration issues, and it wasn't worth the hassle of developing for the small niche of people willing to put thousands of dollars into their machines when most people only spend money on a single GPU for gaming.



Video Conversion


Video conversion can be an easy task for anybody, but it can also turn into a scientific disaster if you don't understand the benefits and limitations of the format you're converting to.


Let's talk about the different reasons of why you would want to convert a particular video to a different format in the first place.


Taking low-bitrate footage and running it through conversion software to increase the bit rate isn't going to magically make the content any clearer. This is the classic case of garbage in, garbage out.


Bit rate is the term for the average amount of data — and therefore quality — per second of footage. It's kind of like the quality slider when saving out a JPG in Photoshop, which determines how clear the image is. Video is essentially a slideshow of images contained within a wrapper alongside audio. If you have 30 photos per second, you need to determine the quality of those photos embedded within the video file, and bit rate is essentially what determines that.
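The arithmetic behind bit rate is straightforward: file size is just average bitrate times duration. Here's a quick Python sketch (ignoring container overhead and a separate audio track):

```python
def estimated_size_mb(bitrate_mbps: float, seconds: float) -> float:
    """File size a given average bitrate works out to (megabits -> megabytes)."""
    return bitrate_mbps * seconds / 8

def average_bitrate_mbps(size_mb: float, seconds: float) -> float:
    """Back out the average bitrate from a file's size and duration."""
    return size_mb * 8 / seconds

# A 3-minute (180 s) clip encoded at 8 Mbps is roughly 180 MB.
print(estimated_size_mb(8, 180))       # 180.0
print(average_bitrate_mbps(180, 180))  # 8.0
```

Running the math in reverse like this is a handy sanity check: if a "4K" file's size divided by its duration gives a tiny bitrate, you know heavy compression did the damage no conversion can undo.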



There is a lot of science that goes into video conversion. A good rule of thumb: don't try to take a 480p video and convert it to 4K resolution.


 

Summary


There is still plenty to go over, with many more graphic design topics in this category that can be talked about.


So much of what I've talked about in this post is stuff I have learned over the years by diving deep into rabbit holes on all these different topics — what I've written here is essentially the highlights.


When you are on the higher-functioning part of the spectrum, you will spend every waking moment trying to figure out the full picture of all this stuff and how it all interacts.


I look forward to making a part two follow up that expands upon what I have already talked about here. Along with more resources.


