Another week and another assignment for my Intro to Music Production class with Coursera.org. I’ve really enjoyed this class but I must admit it’s been a real challenge trying to get all the information to stick in my head.
This presentation discussed the “Usage of the most important synthesis modules.” Just click on the Presentation Art below to make the jump to view my prezi. I hope you enjoy it!
Greetings music lovers! This week my assignment with Coursera for the Introduction to Music Production class is to “Compare and contrast an algorithmic and convolution reverb. Demonstrate the difference and the important features in both types of reverb.”
I’ve tried on several occasions to embed my presentation but have had no luck. Please visit my prezi by clicking the Prezi Artwork below to view the assignment. And, thanks for visiting and taking the time to read through my work.
This is the fourth assignment in a series of posts I’m writing for the online Music Production class I’ve been taking. For this week’s assignment I’ve chosen to prepare a presentation to “Explain distortion and give examples where it can be both musical and problematic.”
Click on the Prezi Artwork below to enjoy my presentation and perhaps learn a little something too.
God bless and thanks in advance for any input you might like to add.
In my last post I briefly took a look at Analog to Digital Conversion. Today I’d like to discuss effects. Not guitar pedal effects, which in my case would probably make more sense to those of you who know me well, but Digital Audio Effects used when configuring a digital mixing board: their categories, plugins, and properties when using a DAW (Digital Audio Workstation).
This is the third post in a series devoted to completing assignments for an online Introduction to Music Production class. I hope you enjoy reading about what I’m learning and perhaps learn something along the way. Any input on your part is appreciated. Thanks in advance for taking the time to read through the material.
Categories of effects: Teach the effect categories including which plugins go in each category and which property of sound each category relates to.
Categories of Effects: Plugins and Properties.
The process of recording, mixing, and editing music has come a long way. Those who have gone before us paved the way to great music production by giving us some pretty awesome tools, or plugins, that help us get the sound we’re hearing in our heads into the airwaves and into the ears of our audience. The complex spectrum of audio effects at our fingertips becomes much simpler once we understand their categories and the most appropriate way to configure them into a signal flow based on their uses.
Digital Audio Effects fit into three basic categories in digital processing that relate directly to basic elements of sound itself. These three categories are:
Category 1: Dynamic Effects
Category 2: Delay Effects
Category 3: Filter Effects.
Dynamic effects plugins – control amplitude over time. You may recognize these effects as gates, compressors, expanders, and limiters. They can give the listener a sense of emotional intensity or help the music “tell the story” by increasing or decreasing the dynamics.
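To make the idea concrete, here’s a toy sketch of my own in Python (not anything from the course; the threshold and ratio values are made up) showing what a compressor does to amplitude:

```python
def compress(samples, threshold=0.5, ratio=4.0):
    """Toy hard-knee compressor: any part of the signal louder than the
    threshold is scaled down by the ratio; quieter parts pass unchanged."""
    out = []
    for s in samples:
        mag = abs(s)
        if mag > threshold:
            mag = threshold + (mag - threshold) / ratio
        out.append(mag if s >= 0 else -mag)
    return out

# Loud peaks get pulled down toward the threshold; quiet samples are untouched.
print(compress([0.2, 0.9, -1.0]))
```

A real compressor plugin also smooths its response over time (attack and release), but the core idea is exactly this reshaping of loud versus quiet.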
Delay effects plugins – Sound propagation, the way sound travels through and around objects, can be simulated in the DAW to give us a sense of space. Delay effects such as chorus, phaser, flanger, and reverb make a recording sound as though it were played in a large or small space. If you want your audience to get the feeling they are in a concert hall or perhaps outdoors, delay effects can accomplish it.
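A minimal sketch of the idea in Python (my own toy example, not a real plugin): a feedback delay mixes each sample with a delayed, attenuated copy of itself, which the ear reads as echoes in a space:

```python
def feedback_delay(samples, delay_samples=3, feedback=0.5):
    """Toy echo effect: add a delayed, attenuated copy of the output
    back into the signal, producing a decaying series of repeats."""
    out = []
    for i, s in enumerate(samples):
        echo = out[i - delay_samples] * feedback if i >= delay_samples else 0.0
        out.append(s + echo)
    return out

# A single click followed by silence comes back as fading echoes.
print(feedback_delay([1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]))
```

Chorus, phaser, and flanger all build on the same delayed-copy trick, just with much shorter, modulated delay times.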
Filter effects plugins – control something called timbre (ˈtambər), the particular sound quality of an instrument such as a trumpet, a violin, or a voice. When you adjust highs and lows in the DAW you are using filters. The most common filters are the parametric and graphic equalizers, or EQs. Other filters include high-pass, low-pass, and band-pass filters.
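As a sketch (again my own illustration, with an arbitrary smoothing factor), one of the simplest filters is a one-pole low-pass, which blends each new sample with the previous output and so rolls off the highs:

```python
def one_pole_lowpass(samples, alpha=0.2):
    """Toy one-pole low-pass filter: each output is a weighted blend of
    the current input and the previous output, smoothing rapid changes
    (high frequencies) while letting slow changes (lows) through."""
    out, prev = [], 0.0
    for s in samples:
        prev = alpha * s + (1.0 - alpha) * prev
        out.append(prev)
    return out

# A sudden jump to 1.0 is smoothed into a gradual rise.
print(one_pole_lowpass([1.0, 1.0, 1.0, 1.0]))
```

A parametric EQ is, at heart, a bank of filters like this one whose cutoff frequencies and gains you can dial in.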
My first assignment was to discuss signal flow in a home production studio set-up. Part of the signal flow which I did not discuss in-depth included the flow through the DAW itself. Knowing where to position which effects can help a lot when producing music especially when mixing multiple tracks.
For instance, let’s assume you’re mixing several background vocals. You’ve equalized them carefully, but now you want your listeners to feel as though the singers performed in a great cathedral, so you’d want to add a delay effect plugin. Trying to mix delay into each singer’s track individually and keep it consistent between the tracks would take some time to accomplish, but if you routed those tracks into one sub-track you could process them all at the same time, equally, and get that cathedral sound without all the fuss of mixing that plugin for each track individually.
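The routing idea can be sketched in a few lines of Python (hypothetical track data, not any real DAW API): sum the vocal tracks into one bus, and a single plugin on that bus then treats every voice identically:

```python
def mix_to_bus(tracks):
    """Sum several equal-length tracks sample-by-sample into one sub-track
    (bus), so a single effect plugin can process them all at once."""
    return [sum(samples) for samples in zip(*tracks)]

vocal_1 = [0.1, 0.2, 0.3]   # hypothetical background-vocal samples
vocal_2 = [0.3, 0.1, 0.0]
bus = mix_to_bus([vocal_1, vocal_2])
# One reverb/delay plugin applied to `bus` now colors both voices equally.
print(bus)
```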
So, you see, having an understanding of when and where to use which effect can make a huge difference in time management in the studio as well as improve accuracy and efficiency in the processing stages.
In reflection, I’ve learned so much as I’ve contemplated and researched this topic, and my appreciation has grown for those who have a great knowledge and understanding of it. Learning these categories and knowing where the plugins fit helps me get my head around some complexities that would otherwise be out of my reach! And, in the end, it’s not so overwhelming.
Thank you again for taking the time to read through my topic and for sharing your knowledge with me!
Hello musicians and friends! My name is Cosima and this is my second assignment for Intro to Music Production online at Coursera.org. For this assignment I’ve chosen to discuss the analog to digital conversion process. I spent some time reading up on the process and enjoyed learning something new. I hope my post will spark some interest in this topic for my readers. Thanks for visiting my blog and reading my post.
Analog to digital conversion process
In my last post I indicated that the source of an audio signal in my studio generally is a voice. The sound of that voice affects the air and creates longitudinal pressure variations that are picked up by a microphone, which converts those variations into voltage variations known as an analog signal. That’s great for live performance, but if we want to send that signal into a computer’s digital audio workstation (DAW) we’ll need to convert the analog wave signal to a digital signal, or data.
The only thing the computer can deal with is strings of numbers: things represented in 1s and 0s, called binary information. So there’s a process to go from the continuously variable sound into a stream of ones and zeros, and that process is called sampling. An analog signal is a waveform, a continuous stream of data that the computer can’t recognize, whereas digital data is discrete, or individually separate and distinct. To convert the analog wave into digital data of ones and zeros I’ll need to use the analog-to-digital converter in an audio interface device.
The audio interface converts analog to digital using a common method that involves three steps: sampling, quantization, and encoding.
The analog signal is sampled at a regular interval, making many, many measurements per second. The most important factor in sampling is the rate at which the analog signal is sampled: over 40,000 times per second is needed to accurately represent the continuously variable signals in the air as a digital representation. The higher the sampling rate, the higher the frequency that can be represented accurately in the digital domain. The highest such frequency is known as the Nyquist frequency, which is just half the sampling rate. So a sampling rate of 44,100 hertz can accurately represent frequencies up to half of that, 22,050 hertz. The human ear can hear up to about 20,000 hertz, and the CD standard sampling rate is 44,100 hertz, which will accurately represent everything we hear as human beings.
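Here’s a quick sketch in Python of these two ideas (my own illustration, not from the course): sampling measures a continuous wave at discrete instants, and the Nyquist frequency is simply half the sampling rate:

```python
import math

def sample_sine(freq_hz, sampling_rate_hz, n_samples):
    """Measure a continuous sine wave at discrete, evenly spaced instants,
    which is what the converter's sampling step does to the analog signal."""
    return [math.sin(2 * math.pi * freq_hz * n / sampling_rate_hz)
            for n in range(n_samples)]

def nyquist(sampling_rate_hz):
    """Highest frequency a given sampling rate can represent accurately."""
    return sampling_rate_hz / 2.0

samples = sample_sine(440.0, 44100.0, 100)   # 100 samples of an A-440 tone
print(nyquist(44100.0))                      # half the CD sampling rate
```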
Sampling yields a discrete, individually separate and distinct, form of the continuous analog signal. Each discrete sample shows the amplitude, the extent of the vibration or oscillation, of the analog signal at that instant. Quantization is done between the maximum amplitude value and the minimum amplitude value: it is the approximation of each instantaneous analog value by the nearest available level.
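As a toy illustration of my own (16 bits here because that is the CD standard word length), quantization maps each sampled amplitude to the nearest of a fixed number of levels:

```python
def quantize(sample, bits=16):
    """Approximate a sample in the range [-1.0, 1.0] by the nearest of
    the discrete levels a `bits`-wide word can hold."""
    levels = 2 ** (bits - 1) - 1   # 32767 levels each side for 16-bit audio
    return round(sample * levels)

# Full scale, silence, and a half-level sample as 16-bit integer levels.
print(quantize(1.0), quantize(0.0), quantize(-1.0))
```

The tiny rounding error this introduces is the quantization noise you sometimes hear about; more bits means more levels and less error.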
In encoding, each approximated value is then converted into the binary format of 1s and 0s the computer can recognize, which we can then manipulate in our DAW for the purpose of music production.
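A small sketch of this final step (my own illustration, using two’s-complement form, a standard way computers store negative numbers):

```python
def encode(value, bits=16):
    """Write a signed quantized value as a string of 1s and 0s
    (two's complement), the binary form the computer stores."""
    return format(value & (2 ** bits - 1), '0{}b'.format(bits))

# Two quantized levels, now as pure binary information for the DAW.
print(encode(5, 8), encode(-1, 8))
```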
Thanks again for taking the time to read my post! Please feel free to leave comments. Your input is appreciated.
(sources include http://www.tutorialspoint.com and Wikipedia)