Kontrol
Choice Morsels from a broad Kitchen.
Thursday, 2 June 2011
Final showing of Composition for tutors and hand in
Annabeth and Mike sat in and watched a screening/performance of my final piece. This was recorded for reference purposes (Assessment and marking).
Done - THURSDAY 2nd JUNE
For hand in
For - FRIDAY 3rd June
Evaluation
This is the first completed project in which I have had a real opportunity to test myself and see what my applied capabilities are at this point in time. Far more ground has been covered here than in previous projects, and I feel this has been due to it being self-directed, as well as a complete shift in my attitude towards work generally. With nothing to hide behind and no one to blame, any shortfalls lie firmly on my shoulders. This has never quite been the case before, although it should have been.
It has also been a project unhindered by the sort of external forces experienced earlier in the year (family, break-ins and finance).
The first indication of this progression has been my ability to remain well within my time-management programme. I made a point at the end of each week of seeing where I was in relation to what was scheduled, and on every occasion the necessary tasks were completed to a higher standard, with days to spare. In the past, this would have meant a decrease in workflow and productivity. With this project, however, I took it upon myself, as I should have done before, to push even harder, using the spare time to research the industry or experiment with new techniques, refining elements of the existing motion graphics and audio to create something that far surpasses anything I have achieved so far. Some of this was down to having to learn new software; however, this was not an issue, more a creative challenge. It enabled me to view the negative aspects that can occur during a project and transform them into positive 'games' to approach laterally, engage with and work around. This has filled me with confidence for all future endeavours, but especially for the format of the BA (Hons) top-up year, in which all aspects are self-directed.
In terms of research, I learned from previous projects that general research is useful for 'mental alignment' with the area of interest, but should be kept as brief as possible; so much time can be wasted here. This time I was able to focus my research on contemporary practitioners whose fundamentals I already knew, or whom I had seen live, through private investigation (AntiVJ, Quayola, Ryoji Ikeda). This enabled me to use these artists as a lever to investigate newer, more localised practitioners, non-profit organisations and collectives such as Test Space, onedotzero, Cornerhouse, Lovebytes and Light Night, and really see what is out there in terms of current trends.
It has been heartening to find that these organisations exist, are more than willing to offer industry-level advice and assistance, and are enthusiastic about new creative ventures, and also that there is a thriving A/V scene in Leeds that is both proactive and inclusive.
I was disappointed not to be able to enter the onedotzero and Light Night festivals/competitions, as their submission deadlines did not align with my timetable for finishing work. However, they are both firmly on the cards for next year, along with many others covered in posts on the blog.
I have contacted Test Space Leeds about getting involved with some of their events over summer, and have recently been told about Lovebytes and Jon Harrison, with whom Annabeth has contact; she suggested getting in touch ASAP about a potential internship over summer. This could also be an avenue for attaining a mentor, another facet of the BA (Hons) course that can help me prosper as a designer. If this facility is taken advantage of early, it can only increase the odds of getting a job in industry and of developing rapidly.
My awareness of industry and practitioners has increased, but I let myself down by being slow to contact these people and, due to incompatible timetables, have been unable to engage fully with this avenue of enquiry. It has, however, opened my eyes to an industry standard of work and application of skills.
Making aesthetically minimal audio-visual content is something I have wanted to do for a while, and the reward for doing so has been tenfold. This type of audio is something I have been creating for some time, so I thought nothing of it. I am happy with the level of production, the quality of the samples and the expression I have found through MIDI by manipulating the length and velocity of hits, as well as having a sample library anyone would be proud to fall back on. The track was created to the best of my ability; however, I have viewed it primarily as a vehicle for attempting MIDI-controlled visuals. Were I to do the audio again, more reduction would be needed: it is still too busy and has a lot going on, though the overall feel of the track remains charming and seductive.
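As a rough sketch of what manipulating hit length and velocity amounts to (this is illustrative plain Python, not my actual DAW/MIDI workflow; the note structure and the humanise function are assumptions made for the example):

```python
import random

def humanise(notes, vel_jitter=12, len_jitter=0.02, seed=7):
    """Vary the velocity and duration of MIDI-style note events
    so a programmed pattern feels less mechanical."""
    rng = random.Random(seed)
    out = []
    for n in notes:
        vel = n["velocity"] + rng.randint(-vel_jitter, vel_jitter)
        dur = n["duration"] + rng.uniform(-len_jitter, len_jitter)
        out.append({
            "pitch": n["pitch"],
            "velocity": max(1, min(127, vel)),   # clamp to the MIDI 1-127 range
            "duration": max(0.01, dur),          # never a zero-length note
        })
    return out

# A four-hit kick pattern: identical hits before humanising.
kick = [{"pitch": 36, "velocity": 100, "duration": 0.25} for _ in range(4)]
print(humanise(kick))
```

Each hit comes out with a slightly different velocity and length, which is the effect described above.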
Learning node-based visual programming has opened up an avenue I had not even considered before this year, and through investigation I have found that this is the software and methodology that practitioners of the moment are using. Kompakt, Hawtin, Ikeda and Quayola all use variations on this theme. Whether it is Max/MSP, Max for Live, Jitter or Quartz, this appears to be the way to achieve my aesthetic and, most importantly, industry-oriented goals. These are the areas I would like to investigate next year and beyond.
Learning this new software/language has been the most time-consuming factor in the project, and has had far more impact on its outcome than I anticipated. However, this has to be allowed for, given the time frame and nature of these shorter projects, and has been approached as another creative challenge.
Although this is the best of my work to date, the piece has many flaws that, were it tackled again, would be addressed differently. For instance, being able to manipulate the camera, especially along the z-axis, would have completely changed its dynamic and feel. I have seen this technique used to great effect before, but it lay beyond the limits of my capabilities at this point. This will be my next challenge in learning Quartz.
Members of the Kineme.net forums suggest that something within their 'GL Tools' patch (£36) enables this facility, but I had to forgo it as time became a factor towards the end of construction.
Another thing I would do differently is reduce the number of artefacts that comprise the piece. The screen is too busy to be considered 'minimal'. However, I do feel that this is a piece of reductionist work with a minimal aesthetic. The main aspect of minimalism within the piece is the set of rules and guidelines it responds to, and the audio somewhat resembles an 'on-trend' minimal techno track. Stricter rules and a more ruthless approach to composition are what is needed to refine these ideas.
The only other major issue, which I have been unable to solve due to financial constraints, has been exporting content from Quartz. With screen-capture software, too many frames per second were dropped, and when filming an HD screen it was difficult to get the right colour balance on the camera, so important detail was still lost in translation. This meant remaining flexible and repositioning, so that the final outcome would be performed as a live installation piece (viewed Thursday 2nd June). Although this was not ideal, I feel the piece works really well live, as the dynamics in the audio and the effect of the visuals have a far greater impact when 'experienced' live, on the correct speakers or headphones (more control over directing the observer's experience).
Export issues will be tackled more thoroughly over summer for future projects.
Therefore, through this project I have firmly identified a specialist area of study, including software and equipment; informed myself about industry practitioners and engaged them via email to extract as much information as possible; implemented and executed an effective time-management programme that has actually worked; and produced a piece of work I am proud of.
Rational 6 - The statement of intent.
Finally, I wanted to round up the loose ends: things accomplished, unfinished and disregarded, and the reasons why.
The statement of intent.
The descriptive title the project was given was 'An expression of abstract construction: minimalist sound', and I set myself the challenge of creating a piece of A/V that adhered to a minimalist frame of reference. In the SOI, I wanted to include far too much unnecessary theory, much of which was dropped from the start. These elements were mythology, synaesthesia and colour theory. Although colour was used, there wasn't much 'colour theory' involved: I decided to represent the audio with colours that I felt best suited the sound. Therefore any semblance of colour theory, beyond colours that work together, was dropped for a more naturally evolving approach. This was because I spent more time creating the audio than expected, and wading through semiotics and the semantic intent of audio ended up taking precedence. I do not feel the piece or project suffers in any way, but it has left a whole new area of communication open for future endeavours, and I will definitely be returning to look at colour in much greater detail.
Wanting to include mythology in a minimalist piece of A/V was also a naïve aspiration. Nothing within mythology sat with the desired aesthetic results, and for this reason it was dropped as a line of inquiry early, so as not to waste time.
The main areas explored were semiotics, symbolism, suggestion and phenomenology. As with most of the practitioners I researched, I found myself reducing sounds to their essence and constructing them in visual form with the shape, colour and movement that best described each sound. Using shapes and lines partially keeps the content open to interpretation, using the bare bones of a conceptual idea to suggest that these shapes MAY represent these sounds (to me they do, but others may have a different interpretation). Wrapped up in this notion, therefore, is an open dialogue between the artist and the observer, inasmuch as I have suggested a 'palette' or motif for the observer to interpret. This is highly subjective, and certain aspects could be perceived as lazy. However, I have tried to use everything within my power to prevent this from being the case, using particle systems, interpolators and generators within Quartz to give the audio a visual dynamic that is not lazy, but considered and natural. I feel this is apparent within the piece.
In terms of software, I have utilised most of the proposed items; however, I have not used Modul8 or Illustrator.
I was planning to use Modul8 in post to manipulate, live or otherwise, an exported Quartz composition. But with the complications in exporting from Quartz, this had to be sacrificed, as I haven't had the content to manipulate.
This, although a disappointment, has been a good thing. It enabled me to focus on getting the most out of Quartz in line with my technical ability; moreover, I feel the piece is too cluttered and busy as it is. More artefacts need to be stripped away, so another layer of production might have confused the situation further. This can't be said for sure, but relating things to my SOI, I stated that I wanted a minimalist aesthetic, and the piece only just adheres to this ideology as it is. Once I have an exported file, I will be moving things into the 'live' arena to see what creative routes are available there (VJ, live, club and real-time manipulation).
Quartz, in essence, enabled me to achieve a reasonable attempt at minimalist A/V, so I feel the addition of other software and inputs might have disrupted clarity and focus. However, a more compartmentalised approach to software and time management may enable me to use them in the future. I will definitely be attempting a live project next year that utilises MIDI and VJ approaches to video.
In terms of my deliverables from the SOI, I stated I would be handing in mood-boards, ongoing analyses of conceptual desires, a comprehensive body of research, concept art and several ideas. This target has been met; the mood-boards I have done, just in an unconventional way. Because my inspiration was video and moving image, I felt it pointless to make static pictures of pieces, as they would be unfaithful to the notion of motion graphics. Therefore, extended research and analysis/deconstruction of the practitioners that inspired the project has been done on my blog, where multiple excerpts from videos and critiques have been used instead.
This did not disrupt the project in any way, nor did it cloud my vision of what I wanted to create. Although nice arrangements of minimalist work would have done no harm to look at, I feel it would have been an unnecessary use of time.
The written analyses and critiques ('Rationals') of position and concept have really helped me stay in check with my time-management programme and enabled more accurate continuity within the concept. Constantly referring back to, and criticising, my work has helped define the narrative from start to finish. I set out to do three pieces at 1,000 words each, and have ended up writing six pieces, all over 1,500 words. And I feel I need to do more, mainly explaining and justifying my reasons for doing what I'm doing. This helps clarify concepts and desires in my head, as well as providing a clearer substrate for marking purposes and for explaining my ideas.
The storyboards and animatics came in the same workload as the concept art, which served as concept art, art direction and storyboards. These images were then abstracted further. All of this is explained in detail in previous 'Rationals' and on the blog.
There are a few facets highlighted in the SOI that have not been kept or used. This may have been to the detriment of the project as a whole; however, all of the elements that were dropped were dropped for a justified, considered reason. Whether the reasons were correct for the product remains to be seen (grade/marking), but the piece aesthetically justifies the theory and facets used. The processes and software that I dropped will be returned to in future projects (colour theory, Modul8).
Tuesday, 31 May 2011
Filming Screened A/V - Output for hand-in determined
Having used the Soundbooth for the week, with the luxury of hearing and seeing the piece both 'big' and 'loud', today I filmed my A/V piece off the huge monitor. This was the 'dodge' around the export issues I have been having with Quartz: filming the screen, using a camera that can reduce the 'flickering' that happens when screens are filmed, and syncing the audio with the visuals in Final Cut.
With Matt Burton's help, this method should have worked fine, and in terms of clarity and detail it did. However, in the filmed piece the colour balance was off: everything felt cold and had an offensive blue hue over it. I attempted to amend this in After Effects using the RGB balance and gamma settings. This helped, and it looked far better than it did straight off the camera.
However, because of this, the piece now has a completely different feel to how it does when observed in 'real-time.'
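The kind of correction applied in After Effects can be sketched numerically; this is a hedged illustration of a per-channel gain plus a gamma lift, not the actual After Effects internals, and the gain/gamma values here are invented for the example:

```python
def correct_pixel(rgb, gains=(1.15, 1.0, 0.85), gamma=0.9):
    """Reduce a blue cast on one pixel: boost red, cut blue,
    then apply gamma. Channel values are floats in [0, 1]."""
    corrected = []
    for value, gain in zip(rgb, gains):
        v = min(1.0, max(0.0, value * gain))  # channel balance, clamped
        corrected.append(v ** gamma)          # gamma < 1 lifts the mid-tones
    return tuple(corrected)

# A cold, bluish mid-grey pixel before and after correction:
# the red-blue gap shrinks, warming the image.
cold = (0.40, 0.45, 0.60)
print(correct_pixel(cold))
```

The same idea, applied per pixel across every frame, is what takes the offensive blue hue out of the filmed footage.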
As a result, the output for the piece has been chosen for me. I do not see this as a problem; if anything, it is fortunate. The piece feels much more right when experienced in 'real time'; however, I would like to write it to disk for the hand-in and show reel.
Annabeth, Mike and Matt Burton have all suggested that this would be an effective method of output, and I feel an installation piece is a fitting way to exhibit this medium of art. It also aligns much more with what is happening in industry, as many practitioners exhibit their work as installations or in live environments such as clubs or festivals, where there can be more control in terms of 'site-specific' compositions.
I am still determined to figure the export issue out. I just fear it will not be in time for hand in.
So there is still more work to do: export and formatting, evaluation and analysis.
For hand-in, I will be submitting the HQ (bad colour) version, synced with the audio and colour adjustments from After Effects, as a REFERENCE to the LIVE PERFORMANCE that will be done on Thursday 2nd with Annabeth and/or Mike and/or Matt Smith.
--- UNLESS I manage to obtain video-capture software that doesn't drop as many frames per second, in which case it will be handed in as a DVD.
-- Unfortunately, the submission dates for onedotzero and Light Night have passed; these would have been ideal events to exhibit at. I am still awaiting a response from Test Space.
Also, Annabeth revealed that an industry contemporary of hers, Jon Harrison, runs a company at lovebite.org that specialises in motion graphics.
I will be getting in touch with him for advice and guidance on places to exhibit, and am hoping to ask about INTERNSHIPS; Annabeth suggested this might be a possibility.
Excited about new developments and the prospect of Industry opportunities.
I am really enjoying this project and am looking forward to continuing developing my skills over summer and into next year.
The Soundbooth is a hidden gem: a rarely used, full recording-studio setup with dual monitors, a Mac Pro, a FireWire interface with 8 inputs and great speakers, fully soundproofed.
I will be using this newly found resource to its full potential in my final year. It is a huge asset that will assist productivity and workflow immensely. The advice of the tutors and the A/V department has been a great source of knowledge and has helped my project reach an optimum conclusion.
Monday, 30 May 2011
Amendments after Crit and Poor res Filmed version of Final Piece
Amendments made - Analysis and Critique
The issues described in 'Rational 5' have now been addressed. A lot of detail is lost in these low-res posts for blogging, which is frustrating. A better quality version (still filmed on a digital happy-snap camera and a laptop screen) will be handed in on disk as evidence; however, due to all the rendering issues, there is likely to be only one HQ version available, which will be handed in as my final piece.
The changes that have been made are an altered bass graphic, masking and the use of layers to add dynamics, and one new element (scrape (sound) - lines (graphic, yellow + purple)). This allows the use of negative space in the 'quiet' sections or drops within the audio, without ever allowing the screen to be just black. The audio drops away at several points to a solo scrape sound, and the visuals accurately depict this.
The masking has been a great feature to add. I have used sprites, usually rectangles, coupled with an Interpolation patch, and made them black so they are indistinguishable from the background. Using the Interpolation patch, I animated them so they move back and forth across the composition, synced with the kick drum. Arranging them on the top layers means that they 'interfere' with, or mask, the artefacts on the layers below. This has broken up the lines and added so much dynamism; it prevents anything in the composition from feeling static, which was one of the main criticisms discussed at crit.
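The motion driving the masks is not textual code in Quartz (it lives in a node-based Interpolation patch), but the idea can be sketched in Python; the function name, BPM and travel distance below are assumptions for the sake of illustration:

```python
import math

def mask_x(time_s, bpm=120.0, travel=1.0):
    """Horizontal position of a black masking sprite, oscillating
    smoothly back and forth once per beat (synced to the kick)."""
    beat_phase = (time_s * bpm / 60.0) % 1.0        # 0..1 within the beat
    # Cosine interpolation: an eased back-and-forth, not a linear snap.
    return travel * 0.5 * (1.0 - math.cos(2.0 * math.pi * beat_phase))

# At 120 BPM a beat lasts 0.5 s: the mask sits at the left edge on the
# kick (t = 0.0) and reaches the far side mid-beat (t = 0.25).
print(mask_x(0.0), mask_x(0.25))
```

Layering several of these eased oscillators on top of the composition is what breaks up the static lines beneath them.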
The dot effect that represents the snare pattern has been turned into a 4x4 grid, within which, in the first half of the audio, the dot flashes randomly around its 16 different positions in accordance with the snare hits.
This is then formalised in the second half of the piece (as the audio is also regulated and formalised) into following a set pattern.
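The random-then-formalised snare grid can be expressed as a small sketch; this is an illustrative Python model of the behaviour, not the Quartz patch itself, and the function and seed are invented for the example:

```python
import random

# 16 cells of a 4x4 grid, read left to right, top to bottom.
GRID = [(col, row) for row in range(4) for col in range(4)]

def snare_cell(hit_index, total_hits, seed=42):
    """Grid position for the snare dot: a random cell for each hit in
    the first half of the piece, then a fixed repeating walk of the
    grid in the formalised second half."""
    if hit_index < total_hits // 2:
        rng = random.Random(seed * 1000 + hit_index)  # deterministic per hit
        return rng.choice(GRID)
    return GRID[hit_index % 16]  # ordered walk, wrapping every 16 hits

cells = [snare_cell(i, 32) for i in range(32)]
print(cells[:4], cells[16:20])
```

The first half jumps unpredictably around the grid; from hit 16 onwards the dot settles into the same ordered cycle every bar, mirroring the regularisation of the audio.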
I still have no answer as to how to move the camera. I had a reply on a forum that suggested using Kineme's GL Tools patch. Having downloaded this patch, I have no idea how to make it work. I have left another post on the same thread asking for further help and more detail, and am awaiting a response.
Having external input and suggestions has enabled me to make the piece far more sophisticated than it was before, utilising ideas and lateral thinking that would otherwise not have been available to me to create something far more engaging: masked, better balanced in composition, less cluttered, far more dynamic and less static.
I feel that using 3D space, and being able to move the camera position (controlled via MIDI for accuracy), would make for really exciting visuals and a far more 'industry standard' level of work.
I will figure it out. Just perhaps not for deadline unfortunately.
Saturday, 28 May 2011
Rational 5 / Crit and Final week
In making the final piece, the main objectives that differed from the tests were compositional values and the specifics of shape and form.
In choosing Quartz as the right software for the task, there has been a steep learning curve.
It has taken weeks of reading forum posts, watching tutorials and an incredible amount of experimentation to achieve a result. As proud of these achievements as I am, the room for growth is infinite. Crit was a perfect chance to receive feedback, on the work as a whole, but also on specifics I hadn't even considered:
-Choice of colours
-The placid and lazy approach to illustrating certain aspects (bass, movement and dynamics)
-New and old practitioners I hadn't heard of
-Methods of output
The piece was received well; however, the fixed viewer position and the lack of movement within the artefacts were commented on. This has been a concern in the realisation of the idea ever since beginning work in Quartz, and is definitely something that needs attention and a response, especially now it has been confirmed by peers.
My initial response has been to think about shifting the viewer or camera angle, via an expressive MIDI output, to add another dimension. If the facility were available directly, I would have implemented this effect already. However, after hours of searching and asking on forums, and emails to Apple developers and Kineme.net advisors, there has been no response, and I have found no patches to date that achieve this effect.
I have seen this effect used by the practitioners I have researched, so I know it is doable; however, they must, at present, be using either custom patches or different software (Max? Jitter?).
Another potential answer is to go back into the composition (which I will be doing anyway to reassess the illustrated bass) and change some of the parameters that affect the axes. Artefacts could be manipulated, especially along the z-axis, to create a better sense of depth.
The same could be said for the direction of certain artefacts.
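Why movement along the z-axis creates depth comes down to perspective projection: the further back an artefact sits, the smaller it appears on screen. This is a generic sketch of that maths, not anything specific to Quartz, and the focal value is an assumption:

```python
def project(x, y, z, focal=1.0):
    """Perspective-project a 3D point onto the screen plane.
    Larger z (further from the viewer) shrinks the on-screen
    coordinates, which is what gives z-axis movement its depth."""
    scale = focal / (focal + z)
    return (x * scale, y * scale)

# The same artefact at two depths: pushing it back along z halves
# its apparent size when z equals the focal length.
print(project(1.0, 1.0, 0.0))  # on the screen plane: (1.0, 1.0)
print(project(1.0, 1.0, 1.0))  # pushed back: (0.5, 0.5)
```

Animating z per artefact (or per family of artefacts) would give exactly the sense of depth discussed above, even without a movable camera.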
The bass colour and illustration also need some work; this too was commented on.
I'm not sure how to take this, but Jon suggested putting some black artefacts in front to block or mask some of the colour and the 'screensaver' vibe the line families have in their current state.
I will use this technique.
I will also try adjusting some of the line families' parameters to dissolve this screensaver element, since making a good screensaver is the last thing I would like the piece to be remembered for.
The last issue from crit that needs addressing, and my main concern since making the a/v, has been its output. Quartz has no easy way of exporting compositions, so after crit it was discussed that, using the sound booths at college, an AV lead and adaptor, the big high-res monitor and a decent camera, the screen could be filmed and edited together with the audio. This way the issues of lost frames with video-capture software and failed QuickTime exports are out of the way. It was also discussed that if all else failed, I could give a 'performance' to tutors that could be recorded for reference, and this would be sufficient for hand-in. (I want to be able to export properly in the future.)
In making the a/v, a few artefacts and patches were used beyond the MIDI tests and experimentation done prior to starting, i.e. complex arrangements of patches to create particle systems that have depth and movement, controlled via accurate animation of their parameters from within the patch inspector window. Previously my understanding was so limited that the output had no element of control or creative expression. However, having had slightly more time with the software, and having gained a better understanding of the dexterity and capabilities of the patches, I now have more control and accuracy in execution.
I began using line families and particle systems, with input patches such as interpolators and random generators, to further emphasise aspects of the audio.
This means, for example, that the particle system representing the hi-hat pattern can now also emphasise the flange effect sweeping over the pattern (in and out), hopefully amending elements that were pointed out in crit, at least to some degree.
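As a rough illustration of what that interpolation chain is doing (the function and names below are my own sketch, not Quartz Composer's actual patches), an Interpolation-style patch simply remaps an incoming MIDI value onto a visual parameter range:

```python
# Hypothetical sketch of an Interpolation-style patch: remap an incoming
# MIDI value (0-127) onto a particle-size range, so a flange sweep over
# the hi-hats becomes visible as well as audible. Names are illustrative.

def interpolate(value, in_min, in_max, out_min, out_max):
    """Linearly remap value from [in_min, in_max] to [out_min, out_max]."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# A flange sweep arriving as MIDI control values over a few frames
flange_cc = [0, 32, 64, 96, 127]
particle_sizes = [interpolate(cc, 0, 127, 0.1, 1.0) for cc in flange_cc]
# Sizes grow from 0.1 at the bottom of the sweep to 1.0 at the top
```

Routing the MIDI receiver through a remap like this, rather than wiring the raw value straight into a parameter, is what lets one sound drive several visual parameters at sensible scales.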
The compositional evolution that has occurred throughout this project has had to remain flexible because of having to learn new software. As this can become an excuse for poor execution, I didn't want to allow it to enter into the equation, and, although this is subjective, I don't think I have. I am delighted with the amount of new things I have learned and the potential for future growth.
Initially I wanted the actual arrangements of lines to be like the waveforms conceived in the concept art and art direction. This had to be altered as soon as I knew what I was up against in using Quartz. These would be possible, but at my level of dexterity with the software it would have been far too time-consuming to accomplish the effects I desired. They could be loosely modelled and replicated via the 'line families' in Quartz's library, which are easier to animate and have far more parameters to control than the more limiting 'lines', 'billboards' or 'sprites'.
I really liked the patch once I got to grips with it, and I plan to use it more. However, as pointed out in crit, line families have a tendency to look like screensavers. I feel they may need to be abstracted slightly: fewer lines, and more clever use of the z-axis to add a greater sense of depth (dynamic).
I feel the piece's evolution has been natural, in response to creative challenges that have arisen during the course of the project. It has moved a long way since the Statement of Intent, and I feel these changes have been justified and clearly examined during the course of analysis.
Ideas like synaesthesia went out of the window early on, as my intentions swung towards accurately expressing sound in a visual format; my understanding of my own intentions expanded as the project grew.
The initial foundations from my SOI are still there, but the details have been abstracted, just like the details in the visuals, hopefully keeping continuity with the concept as initially conceived: minimalism and defined rules lying at the root of the proposed expression.
I am still perplexed by issues of minimalism and reductionism. I have reached the conclusion that to attempt any piece of minimalist work, there has to be an element of reductionism; a minimal aesthetic cannot be conceived without this process. Whether mental and philosophical, or physical and finite, there is some element of stripping away. For example, inherent within the question "How do I express the essence of an object, artefact, sound or judgement in a visual form?" is the requirement that the artist think of the object and strip it of its shell and semiotic meaning, already shredding and reducing the object in thought. This is the conclusion I have drawn from my investigation into minimalism and my attempts to use its ideology faithfully.
These new ideas will be implemented and evaluated.
Thursday, 26 May 2011
Facets of Quartz
Quartz Composer has many similarities to Max/MSP or Vvvv although its primary usage is for graphical rather than audio processing. The ability to construct interactive video compositions that react to audio or MIDI signals but which can be played from any QuickTime-aware application has caused a great deal of interest in Quartz Composer from VJs.
PATCHES
Quartz Composer works by connecting patches. Patches are the base processing units. They execute and produce a result. For better performance, patch execution follows a lazy evaluation approach, meaning that patches are only executed when their output is needed. There are three types of patches: Consumers, Processors, and External Input patches that can receive and output mouse clicks, scrolls, and movements; MIDI and audio; keyboard; or other movements. A collection of patches can be melded into one, called a macro. Macros can be nested and their subroutines also edited.
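The lazy-evaluation idea described above can be sketched in a few lines of Python (a toy model of my own, not Quartz Composer's real implementation): each patch pulls its inputs only when a consumer asks for its output, so a patch that nothing consumes never runs.

```python
# Toy model of lazy patch evaluation: outputs are computed on demand,
# pulled from downstream consumers. Class and patch names are invented
# for illustration; they are not the real Quartz Composer API.

class Patch:
    def __init__(self, name, fn, *inputs):
        self.name = name
        self.fn = fn
        self.inputs = inputs
        self.executions = 0  # track how often this patch actually ran

    def output(self):
        self.executions += 1
        return self.fn(*(p.output() for p in self.inputs))

# Processor patches
source = Patch("LFO", lambda: 0.5)
gain = Patch("Gain", lambda x: x * 2.0, source)
unused = Patch("Unused", lambda x: x + 1.0, source)  # nothing consumes this

# A consumer (e.g. a renderer) pulls only the chain it needs for the frame
frame_value = gain.output()  # 1.0; 'unused' is never evaluated
```

This demand-driven pull is why adding patches to a composition costs nothing until a consumer is actually wired to them.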
COMPOSITION
Patches, their connections, and their input port states are saved in the composition file. Images can be stored inside a composition as well, making for self-contained compositions with embedded graphics. By dragging a movie file into the Quartz Composer editor, a reference to the movie file is created, providing a changing image that can be connected to a renderer.
Compositions also store metadata such as composition author, copyright, and description.
IMAGE FILES RECOGNISED: JPEG, JPEG2000, GIF, PNG, TIFF, TGA, OpenEXR, BMP, ICO, PDF, PICT, ICNS
Support for some Automator actions was added with the release of Leopard:
-Apply Quartz Composition Filter to Image Files
-Convert Quartz Compositions to QuickTime Movies
-Render Quartz Compositions to Image Files
Thursday, 19 May 2011
Another Really useful Quartz tutorial
Line families will be used a lot in the construction of my piece. This tutorial has helped further demonstrate the capabilities of these patches, and sections will definitely be trialled in my own Quartz tests.
Rationale 4: Conceptual shifts, Software, New delivery thoughts.
From creating the concept art, I thought I had a clear idea of how the final piece would look. This turned out to be more of a directional hint, and it has evolved considerably since I began using Quartz. I have found that the software and my technical skills have affected what is possible far more than I imagined. Instead of letting this become a hindrance, it has been taken as a creative challenge and something to work around.
It was stated that the concept art (lines) would represent waveforms, which they do, and it was also written that these images would be abstracted one stage further than they are, for suggestive purposes and aesthetic potential. This has turned out to be the case, though through software and technical implications rather than personal endeavour.
In Quartz, the patches I have used for the line effects are called 'line families.' These look similar to, but are not the same as, the image. With my limited technical knowledge of the software, manipulating or replicating those images exactly in Quartz would have been an impossible feat. The parameters available for manipulating line families far outweigh the image capabilities, and are easier to understand and see in a visual form.
Also, I feel I have found a place for, and a way of, introducing colour into the composition.
I wanted this to come from a natural place, so I have been listening to the different sounds and patterns, and imagining the colours and feelings these sounds evoke. I have come to several conclusions, to be discussed at a later date, all arising from a recurring mental and visual theme relevant to each specific sound. I feel this has really helped me develop the shape of the sound (in a visual sense) and the colour, to help evoke an emotional response to the audio.
The audio has changed slightly. I felt that three minutes was too long and that some of the emphasis and dynamics were lost to the repetitive nature of loop-based music. It has essentially been reduced one stage further, dropping one section in the second half and some of the intro.
Introducing the sounds one by one in the piece is very important. It enables the observer to easily attach each shape or animation to an element of sound before all the elements react together. I hope this will further develop the observer's attachment to the piece by 'inviting' them in, hopefully adding an interactive element. This, I feel, makes up for the loss of the original suggestive element and adds a different form of suggestion.
This is the main evolution so far having started creating the piece.
The aesthetic of the piece has also evolved somewhat, and I have been more than happy to accommodate these evolutions, partly out of necessity, but also simply to extend the learning curve I am currently on with a new language and its software formalities. I am really enjoying what has been coming out of Quartz. It has reminded me that this method of creation is not beyond my reach, and I would like to investigate Max/MSP and other node-based visual programming languages in the future. A whole new range of possibilities has become available through this exercise.
This also includes having looked at other companies, competitions and festivals, indicating a thriving scene for this sort of work. Industry is looking at least 0.5% less intimidating than it did six months ago.
I feel further research has contextualised and given meaning to why the audio is expressed in the manner it is.
Kontact and Minus records are all using Quartz to display visuals at live performances, which has validated using this software (in terms of trends and industry), as well as my wanting to understand more about this side of programming. It is leading me down a path that I am thoroughly enjoying observing and participating in.
The same goes for the aesthetic of the piece. It is inspired by minimal artists both contemporary and classic, from Peter Peri and Sol LeWitt through to the current trends in minimal visuals and audio in Richie Hawtin, Pfadfinderei, Quayola, Ryoji Ikeda and the others mentioned in prior research, indicating that the concepts and aesthetics present in my work are in line with what is going on in industry.
Moving onto primary research in this area:
Having emailed Test Space asking for suggestions on how to exhibit work and on potential involvement, I await a response to see what ideas and suggestions industry has to offer. I will be going to Test Space in the near future to talk with them personally about potential avenues of interest.
I will be entering this piece into the onedotzero Adventures in Motion festival. Deadline: 29 May.
In looking at forums and trying to find answers to my exporting issue, it appears that when using MIDI, there is no known way to export a correct, working QuickTime movie from a .qtz composition:
http://kineme.net/forum/Discussion/DevelopingCompositions/ExportQCclipmiditriggerAbleton
There are dodges around it, but the actuality is that there appears to be no way of doing it.
However, it has been suggested that if the video is sent out from the .qtz to a video-in and recorded, it may work. This lies beyond my capability here, but the resources and staff at college should be able to advise me.
If this does not work out, there may have to be a rethink on the delivery and execution of the piece, maybe into a live application (installation). Implications?
Rendering and export issues - Talking to James (Quartz and Max engineer)
Over the course of using Quartz and extensive research on forums, it has come to my attention that there are many issues surrounding exporting and rendering animations from Quartz.
1) Many people, including James, have mentioned that with the upgrade to QuickTime X, a lot of features were dropped from previous versions, features that .qtz files need in order to export properly.
So I got a copy of QuickTime 7 and followed multiple methods, trying to get my animation exported.
I have only been able to achieve a MIDI-unrelated section of moving graphics lasting 30 seconds, regardless of how the animation duration is set. (If set to 4 hours, it would still only render 30 seconds.)
2) Talking to James in a/v, he suggested a few dodges, like rendering a blank video clip lasting the length of the proposed animation, to let the Quartz export clock recognise some time-defined parameters. This was to no avail; we tried it on my machine and his, and both failed miserably.
3) With more deliberation, and after trying the same method with audio clips, there was one more thing he could think of trying: screen-capture software.
However, after research and trial and error, none of the results are good enough to display as a final piece. Too many frames per second are dropped, so the animation stutters and jolts, destroying any semblance of fluidity and, with it, the effect of the MIDI-related visuals.
I tried i Capture, Adobe Captivate and QuickTime movie capture, as well as a few more free ones. However, I don't have the money to spend on HD-quality video-capture software, even if one good enough exists.
James and I came to the conclusion that this is because Quartz operates independently of the signal/duration of the audio being sent from Live; the duration is not defined within Quartz.
The only way round this issue that I can see at the minute is to construct the Quartz piece, then make use of the HD camera resource and sound booths at college: film a monitor displaying the visuals (still sending the audio from Live) using the HD camera, then sync the audio up with the visuals in Final Cut. I feel this may be the only dodge left until I have the resources to purchase good video-capture software or find a method for exporting MIDI-oriented visuals from Quartz.
There might be another option of sending the video out to another monitor or computer that has a video-in. This needs to be investigated, as there is the potential of using the sound labs at uni to accomplish what I need to do.
This will be resolved before Monday!
Wednesday, 18 May 2011
Second Quartz Test - All elements of the track Reacting to MIDI
Unfortunately Blogger isn't allowing the upload of videos at the moment, so I will have to use screen grabs to illustrate what was in the viewer window. The videos of the tests will be handed in on disk.
Describing the linked nodes used to create the output achieved.
PATCHES:
It has been hard to get grabs of specific aspects of the composition test; the colours are faint, and in sections it is unclear what's what.
This is the kick drum, represented as a 'line family.' The first thing was to adjust the x and y position, purely aesthetic at present. The next was to adjust the orientation of the family, again defined by x, y and z parameters. Once happy, I used a generator and an interpolator, chained to the start and finish points of the z and y coordinates, to produce a shift and fold in the lines from left to right. To add dynamics, I also chained the note receiver to the 'enable' parameter, which makes the animated line family come on and off with the 'action' of the kick drum.
These were then condensed into a macro and named KICK.
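The gate-plus-interpolator idea behind the KICK macro can be sketched in plain Python. This is a minimal, hypothetical model (the function names are mine, not Quartz's); in Quartz the same logic is built by wiring the note receiver into the 'enable' port and the interpolator into the coordinate ports:

```python
def interpolate(start, end, phase):
    """Linear interpolator, like chaining a generator's 0-1 phase
    into start/finish coordinate points in Quartz."""
    return start + (end - start) * phase

def kick_frame(note_on, phase, z_start=-1.0, z_end=1.0):
    """One frame of the kick visual: the line family is only drawn
    ('enabled') while the kick's MIDI note is on, and its z position
    sweeps between the start and finish points as the phase advances."""
    if not note_on:          # note receiver wired to 'enable'
        return None          # family hidden between kick hits
    return interpolate(z_start, z_end, phase)

# A kick hit a quarter of the way through its sweep:
print(kick_frame(True, 0.25))   # -0.5
print(kick_frame(False, 0.25))  # None (line family disabled)
```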
This illustrates the cymbal pattern and part of the snare; the snare will be discussed later. The cymbal pattern has the most dynamic rhythmical influence on the track and needed to be represented in a fluid motion. I also wanted to experiment more with particle systems, as I have some plugins specifically for manipulating them that I will be trying soon. So again: out of Live on a specific channel, into a newly set up MIDI notes receiver, and routed to a different channel. This was then chained to a particle system with limited parameters; however, it had gravity-responsive interactivity between the particles, which I found interesting. After manipulating the size and positioning of the particles in terms of composition, I decided to experiment with differing alpha channels and layers of colour. This yielded some results, none of which I feel will be taken forward into the final piece (COLOURS), but the particle systems will definitely be developed further.
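The gravity-responsive interactivity between particles can be modelled with a toy 2D particle system in Python. This is only a crude sketch of the general technique (inverse-square attraction between particles), not the actual behaviour of the Quartz particle system patch:

```python
import math

def step(particles, dt=0.1, g=1.0):
    """Advance a toy 2D particle system by one frame. Each particle is
    {'pos': (x, y), 'vel': (vx, vy)} and every particle attracts every
    other one with a simple inverse-square pull."""
    # First accumulate accelerations and update velocities...
    for i, p in enumerate(particles):
        ax = ay = 0.0
        for j, q in enumerate(particles):
            if i == j:
                continue
            dx = q['pos'][0] - p['pos'][0]
            dy = q['pos'][1] - p['pos'][1]
            dist = math.hypot(dx, dy) or 1e-6   # avoid division by zero
            ax += g * dx / dist**3              # dx/dist gives direction,
            ay += g * dy / dist**3              # 1/dist**2 gives magnitude
        p['vel'] = (p['vel'][0] + ax * dt, p['vel'][1] + ay * dt)
    # ...then move every particle with its new velocity.
    for p in particles:
        p['pos'] = (p['pos'][0] + p['vel'][0] * dt,
                    p['pos'][1] + p['vel'][1] * dt)
    return particles

# Two particles released at rest drift towards each other:
pair = [{'pos': (-1.0, 0.0), 'vel': (0.0, 0.0)},
        {'pos': (1.0, 0.0), 'vel': (0.0, 0.0)}]
step(pair)
```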
This relates to the snare (woodblocks). These felt fractured to me; because of how much musical space surrounds them, I felt this needed to be exploited and represented faithfully in a visual format. There is a hit, its y and z axes manipulated by a generator between start and finish points, so it swings across from out of vision bottom right into frame as a curve diminishing to nothing (-z axis). It also flashes with a line (middle right); this gives the hit a 'lifeline' to the composition in the space where it is not audible or visible.
There is another layer of rhythm in this image. To the left and bottom there are lines that represent a reduced, static-sounding rhythm part. This was a bad idea in the composition of this test. I want to recreate that classic sci-fi x/y axis scroll over data; think of the Blade Runner computer in Deckard's apartment. The Ryoji Ikeda piece 'Formula' demonstrates similar qualities. I would like this to come on and off in rhythmical time, as well as a horizontal line moving from top to bottom and a vertical line that travels from left to right. (See the Formula research on this blog about Ryoji Ikeda for details of the above description.)
This line family represents the bass line: on and off in time with the duration of the MIDI hit, and extending in length as defined by the same parameters. This created a brilliant effect. Positioned along the z axis, it was brought beyond the camera so it was only fractionally seen. The whole line family was huge; this was just the manipulated part that I wanted visible. Again the colours are much brighter in real time, and span right to left in accordance with the intervals, lengths and varying patterns of the bass track (a MIDI dummy bass track). These are also velocity-assignable.
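The duration-and-velocity mapping for the bass line can be sketched as a small Python function. The exact scaling here is hypothetical (Quartz wires this with generator and receiver patches rather than a formula), but it shows the idea of note parameters driving visible line length within the viewer's 2-unit-wide space:

```python
def line_length(duration_beats, velocity, max_len=2.0):
    """Map a bass note's duration (in beats, capped at one beat) and
    MIDI velocity (0-127) to the visible length of the line family.
    max_len=2.0 matches the 2-unit width of the Quartz viewer area."""
    return max_len * min(duration_beats, 1.0) * (velocity / 127.0)

print(line_length(0.5, 127))  # 1.0 - a full-velocity half-beat note
print(line_length(0.5, 64))   # ~0.504 - same note at half velocity
```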
A great second test. I am really enjoying getting to grips with this programming language. Having downloaded plugins and patches from kineme.com, I feel the next attempt will be exponentially better.
After hours and hours of learning, tutorials, MIDI forums and Live discussions about sending MIDI, the foundations of the language are finally in place.
Patches are the principal method of applying graphics, animating and outputting data in the Quartz editor window. To keep things tidy, there is a really handy macro and parenting system, which is a great reference tool if everything is named properly. The process begins as in the post about MIDI test one: a MIDI notes receiver is the first patch required, set up to receive MIDI signals sent from Live out over the IAC driver (a form Quartz can recognise) on a bus. Each track in Live is told to send information over a specific channel, which is then matched in the settings of the MIDI notes receiver in Quartz. It is then just a case of defining the scale of octaves required (for some reason Quartz thinks the notes are two octaves higher than they are, i.e. a MIDI hit of C3 from Live would be received in Quartz as C5 - something that took hours to figure out; after reading forums and Mac developer sites, it appears to be a bug with the Mac OS X 64-bit Snow Leopard version of Quartz, as explained in one of the MIDI YouTube tutorials). Once the environment is set up, there is a series of modulation and generator patches that enable you to apply graphics and manipulate them according to the MIDI signals, clock or envelopes applied in Live and the MIDI notes receiver in Quartz: lines, cubes, gradients, animation, defining parameters of these, a full x, y, z axis system and access to a huge range of colours. There are also multiple free plugins that allow for even more dynamics.
First Quartz MIDI test - developments and issues
This was the first time I opened Quartz. It has a relatively simple interface to get to grips with: an editor window with a viewer, a library and a patch inspector, like most Apple applications. It comes with patches and plug-ins, but after extensive research on forums, the general consensus seemed to be that kineme.com had many powerful free ones for download that allow extra parameters and functionality, so I got some of those to test out at a later date.
Before beginning to think about the audio and Live, I needed to adjust a setting in the Audio MIDI preferences. This had to do with the IAC driver that is required to send MIDI out on a bus and have it recognised in Quartz via that bus and a MIDI notes receiver.
The full Ableton file of the track and its construction was then brought into Live. First of all, a couple of the channels were audio files. Because it is the MIDI signal that is required, these had to be converted. This was done with a function in Live 8 that allows you to slice an audio track to a new MIDI track.
However, this put the different frequencies/slices of the audio onto different MIDI notes in the scale, making it a mammoth task in Quartz to join the right aspects of the receiver to the signal it is receiving, as there are so many notes.
After a bit of thinking, I decided that the best course of action would be to create a dummy MIDI track for each of the audio clips, copying the pattern into a MIDI track without assigning it a sound. This meant that the MIDI could be sent out to Quartz in simple patterns that correspond to the waveforms, on a single note. Thus, when receiving the signal in Quartz, there is only one patterned MIDI note to deal with for each instrument in Live (comprised of multiple sounds and notes).
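The dummy-track trick can be sketched in Python: keep each sliced event's timing and velocity but collapse every pitch onto one fixed note. The event format and choice of note 60 (C3 in Live's numbering) are my own illustrative assumptions; in practice this was done by hand-copying the pattern inside Live:

```python
def to_dummy_track(sliced_events, dummy_note=60):
    """Collapse a sliced-to-MIDI clip, where every slice landed on a
    different note, into a single-note rhythm pattern. Events are
    (time_in_beats, note, velocity) tuples; the set also merges any
    slices that would otherwise double up on the same beat."""
    return sorted({(t, dummy_note, vel) for t, note, vel in sliced_events})

# Four slices scattered across notes 36-41 become one patterned note:
slices = [(0.0, 36, 100), (0.5, 38, 90), (1.0, 41, 100), (1.5, 38, 80)]
print(to_dummy_track(slices))
```

With the rhythm on one note, the MIDI notes receiver in Quartz only has to watch a single note per instrument instead of the whole sliced scale.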
This took me a while to get round, but it has turned out to work perfectly. Not the most elegant solution, but one that works nonetheless.
So with the different components on separate 'sends' or channels, being sent out on bus 1, and a lot of trial and error with different settings in Quartz, it finally started working.
Another snag, which took a few hours of research, is that the Snow Leopard Mac OS X 64-bit version has a slight bug: a MIDI note sent from Ableton on, for example, C3 would be received in Quartz as C5, two octaves above. There is no answer for why this happens as yet; on forums, even developers at Apple are not sure. However, the beauty of Quartz and the developer tools is that they are open source, so there is an open dialogue between Apple and the user for updating and developing the software, and it is only a matter of time before someone solves this issue.
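Until the bug is fixed, the offset is easy to compensate for, since two octaves is exactly 24 semitones. A minimal sketch in Python (the function name is mine; in Quartz this just means listening for the note 24 semitones above the one Live sends):

```python
def correct_octave(received_note, offset_octaves=2):
    """Undo the Snow Leopard 64-bit Quartz bug, where notes arrive two
    octaves (2 x 12 = 24 semitones) above the note actually sent."""
    return received_note - offset_octaves * 12

# Live sends C3 (MIDI note 60 in Live's numbering); Quartz reports C5 (84):
print(correct_octave(84))  # 60 - back to the note Live sent
```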
In the Quartz editor environment, several things needed doing before the signal could be received.
With the MIDI notes receiver in place, the channels (received from Live) can be selected under the settings section of the patch inspector drop-down menu. Once matched, the signal being sent out of Live to the notes receiver is active.
The first thing that was done compositionally was to add a black backdrop. This was achieved through the library window and setting a backdrop.
A series of lines was then added through a line family patch from the Quartz library. An interpolator was then added and assigned to the 'enable' function of the line family patch, as well as being attached to the start and finish positions on the z axis.
Other patches available were sprites and billboards. (There are many more, but for the basics and interpreting MIDI these were the easiest to use first time round.)
Another thing to note is that the area of the viewer window is 2x2. Everything is contained within these parameters in the patch inspector also (axis parameters).
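That 2x2 coordinate space means positions are not in pixels: the viewer runs roughly -1 to +1 on each axis, with y increasing upwards. A hedged sketch of the conversion, assuming a square rendering area for simplicity (Quartz's exact handling of non-square aspect ratios differs):

```python
def pixel_to_quartz(px, py, width, height):
    """Map a screen pixel coordinate (origin top-left, y down) into the
    2-unit-wide Quartz viewer space (origin centre, y up)."""
    qx = (px / width) * 2.0 - 1.0   # 0..width  -> -1..+1
    qy = 1.0 - (py / height) * 2.0  # 0..height -> +1..-1 (flip y)
    return qx, qy

print(pixel_to_quartz(400, 300, 800, 600))  # (0.0, 0.0) - centre of the frame
```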
Tuesday, 17 May 2011
Quartz - line families
Line families will influence my project greatly. They have a large number of parameters and, used in conjunction with MIDI, random generators and interpolation patches, can yield some great results, most impressively in three dimensions.
Quartz Composer 1 - Learning new software / new language
Quartz Composer is a node-based visual programming language that comes as part of the Mac OS X Xcode developer tools. It programs and renders graphics. The first thing to know about Quartz Composer is that it isn't like most development tools: instead of writing pages' worth of code to directly manipulate the various graphics, you work visually with processing units called patches. These patches are connected into a composition. As you work with a composition, adding patches and connecting them, you can see the results in a viewer window. Each and every change you make is immediately reflected in the viewer, no compilation required. This results in a development experience like no other.
Quartz Composer can be used to prototype Core Image filters, build screen savers, create custom user-interface widgets, make data-driven visual effects, and even perform live performance animations.
The Quartz Composer interface:
A patch is similar to a subroutine in a traditional programming environment. You can provide inputs to the patch, the patch will then execute and produce some results. Circles on the left side of a patch represent the various inputs the patch will accept. Circles on the right side are the outputs. The kinds of inputs a patch will accept and outputs it will create depend on its functionality.
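The subroutine analogy can be made concrete with a toy Python model. This is only an illustration of the concept (the class and names are hypothetical, not Quartz's API): inputs go in the left-hand ports, execution produces the right-hand outputs.

```python
class Patch:
    """A toy model of a Quartz patch: named input ports on the left,
    an execute step, and a result that would feed the output ports."""
    def __init__(self, fn, inputs):
        self.fn = fn            # the patch's behaviour
        self.inputs = inputs    # input port name -> current value

    def execute(self):
        """Run the patch on its current inputs, like one render pass."""
        return self.fn(**self.inputs)

# A 'multiply' patch with two input ports, as if wired up in the editor:
mult = Patch(lambda a, b: a * b, {'a': 3, 'b': 4})
print(mult.execute())  # 12
```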
These are the patches I made for MIDI TEST 2 that will be elaborated on in a later post.
Data inside QC can be one of the following types:
Boolean - a boolean value, 0 or 1
Index - a positive integer between 0 and 2147483647
Number - a double precision floating point number
String - a unicode string
Color - an RGBA or CMYK quartet, or a Grayscale value
Image - a 2D image of arbitrary (possibly infinite) dimensions
Structure - a named or ordered collection of objects, including nested structures
Virtual - any of the above
It is incredibly diverse. Its versatility has enabled me to use MIDI to manipulate graphics in real time and create fully synchronised A/V.
This is an entirely new piece of software to me, and it's going to be a huge learning curve.
I have no prior experience using it, but it is more than capable of producing the results I want.