An artist working with electronics and electronic media, based in Brooklyn, NY

Research

Update from the TextielLab in Tilburg, NL

TextielLab documentation photos (IMG_0049 through IMG_0095)

Day 4 @ TextielLab, TextielMuseum, Tilburg, NL

The palette is fixed and I’ve settled on my final design constraints and source material. For the next two working days in the lab, I’ll be weaving fragments from core memory dumps. Raw binary data from my system RAM has been rendered into a 6-bit color space with a total of 64 colors. The data itself is a collection of fragments of files, images, sounds, temporary data, and programs: a sketch of my activities assembled according to the obscure logic of my operating system.

Complete documentation of the process and resources will come in the following weeks.

After having my PC laptop, camera, and audio recorder stolen on a train to Amsterdam, I am indebted to my dear friend Jeroen Holthuis for helping me write a program in Processing which performs variable bits-per-channel rendering of raw binary data, in a similar fashion to Paul Kerchen’s LoomPreview. He has also been kind enough to loan me his camera and host me for some of my time in the Netherlands. Many thanks!
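For anyone curious, the core of such a renderer is simple enough to sketch. Below is a minimal Python rough equivalent (my own reconstruction of the idea, not Jeroen’s actual Processing code): it slices the raw byte stream into a fixed number of bits per channel, so 2 bits per channel yields the 6-bit, 64-color space described above.

```python
def render_pixels(data: bytes, bits_per_channel: int = 2):
    """Map a raw byte stream to RGB pixels, taking `bits_per_channel`
    bits for each of R, G, B (2 bits/channel -> a 64-color palette)."""
    # Flatten the bytes into one bit string, MSB first
    bits = "".join(f"{b:08b}" for b in data)
    step = bits_per_channel
    scale = 255 // (2 ** step - 1)   # expand each value to the 8-bit range
    pixels = []
    for i in range(0, len(bits) - 3 * step + 1, 3 * step):
        r = int(bits[i:i + step], 2) * scale
        g = int(bits[i + step:i + 2 * step], 2) * scale
        b = int(bits[i + 2 * step:i + 3 * step], 2) * scale
        pixels.append((r, g, b))
    return pixels
```

Raising `bits_per_channel` to 8 would give full 24-bit color; lowering it makes the palette coarser and the “fabric” of the data more visible.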


[STEIM] Breadboard Based Modular Synthesizer System by Pete Edwards and Phillip Stearns

From May 1 through May 14th, Pete Edwards and Phillip Stearns have been working on developing an open platform for endless musical and electronic invention, exploration, and discovery from the bottom up or the top down. This system is based on minimizing the differences in the input and output “languages” used in various musical electronic formats. This means finding a way to allow free communication between logic, analog and eventually digital electronics. We are working to achieve this by finding a middle ground between these mediums where signal format and amplitude can be shared freely with minimal need for translators and adaptors. Our proof of concept models have shown that unhindered communication between binary logic and variable analog systems renders wildly adventurous possibilities and a unique musical character.

The form-factor ethos is one where our passions for invention and performance are given equal attention. The key to achieving this goal is designing a hardware system with maximal scalability of size, quality, and hardware format, allowing the experimenter to quickly and cheaply connect circuit boards with simple jumper wires. Meanwhile, the traveling musician may prefer to house their system in a rugged enclosure with large-format control hardware. This is effectively achieved by adopting a standard layout for a set of core modules which can be built up to the appropriate scale using a series of shields and pluggable add-ons.

After a series of discussions on what such a system might look like and how to establish a standard that could be as flexible as possible, allowing for the nesting of micro and macro elements, we began prototyping modules and stackable hardware interfaces.

Project documentation is still underway, and schematics for the prototypes are still in development. However, after only two weeks, we have produced a functional system that fulfills many of our goals, including portability, quick system (re)configuration, an open patchable interconnection architecture, and a stable breadboard-compatible form factor with the potential for stackable shields and interfaces.

Future plans discussed for the project include developing VCO, VCA, and VCF modules that operate on 5 volts; releasing schematics and system specifications to the public; and producing low-profile breadboard-compatible modules in kit and pre-fabricated form, with options for either through-hole or SMD components.

A video demonstrating the 4000 series CMOS logic based modules can be viewed here.

The Module Prototypes:

Prototype of The Shifter module.

The Shifter – A dual 4-bit serial-in, parallel-out (SIPO) shift register (CD4015) is connected as a single 8-bit SIPO shift register. Two 1-of-8 digitally addressable analog switches control two feedback taps, which allow any of the shift register’s 8 outputs to be fed back to the register input. Input to the register is the output of four cascaded dual-input XOR gates (CD4070), for a total of 5 possible inputs. The first two inputs are provided by the 1-of-8 switches, the third and fourth inputs are labeled as “mod” inputs for patching of any logic-level signal, and the fifth input is connected to a “seed” button located on the lower left corner of the module. A logic-level signal on the clock input will shift, or advance, the register on every positive-going edge transition. Setting the feedback taps to the same state will fill the register with logic 0 on each positive edge transition of the clock input. The register may need to be jump-started by pressing the “seed” button occasionally in the event that all outputs go low (lock-up condition). The edge connector and header row provide connections for ground, power (3-18V), address and inhibit control inputs for each of the 1-of-8 switches, “mod” inputs, the 8 parallel outputs of the register, and outputs from three of the XOR gates (1 = both feedback taps XORed, 2 = the second tap and “mod” inputs XORed, 3 = “mod” inputs XORed).
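The feedback behavior is easy to model in software. Here’s a rough Python sketch of the register-plus-XOR loop (a simplification: the “mod” inputs are held low and the “seed” button injects a single 1 before clocking begins); note how setting both taps to the same output flushes the register to all zeros, the lock-up condition described above.

```python
def shifter(tap_a: int, tap_b: int, steps: int = 16, seed: bool = True):
    """Model the Shifter's 8-bit SIPO register with XOR feedback
    from two selectable taps (indices 0-7)."""
    reg = [0] * 8
    if seed:
        reg[0] = 1                     # the "seed" button injects a 1
    states = []
    for _ in range(steps):
        fb = reg[tap_a] ^ reg[tap_b]   # XOR of the two feedback taps
        reg = [fb] + reg[:-1]          # shift on the positive clock edge
        states.append(reg.copy())
    return states
```

With equal taps the XOR always yields 0, so the seed bit simply marches out of the register; with unequal taps the pattern keeps regenerating itself.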

Prototype of the Divide by 2 by 2 by 2… module.

Divide by 2 by 2 by 2… – A single 12-bit binary counter (CD4040) takes a logic-level signal and provides 12 sub-octaves, each available as patch points on the header on the left side of the module. Additionally, three 1-of-8 digitally addressable analog switches (CD4051) provide independent selection of the first 8 sub-octaves generated by the binary counter. The header row along the bottom provides connections for ground, power (3-18V DC), counter clock input, counter reset, address lines and inhibit control inputs for each of the three 1-of-8 switches, and the final four output stages of the binary counter.
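The divider chain can be sketched in a few lines of Python (assuming the CD4040’s falling-edge clocking): each counter bit toggles at half the rate of the one before it, which is exactly the stack of sub-octaves the module exposes.

```python
def sub_octaves(clock, stages: int = 12):
    """Model a CD4040-style ripple counter: each output bit runs at
    half the frequency of the previous one, giving `stages` sub-octaves.
    `clock` is a list of logic levels (0/1) sampled over time."""
    count, prev, outputs = 0, 0, []
    for level in clock:
        if prev == 1 and level == 0:           # CD4040 advances on the falling edge
            count = (count + 1) % (1 << stages)
        prev = level
        outputs.append([(count >> n) & 1 for n in range(stages)])
    return outputs
```

Feed it a square wave and output bit n completes one cycle for every 2^(n+1) input edges.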

Prototype of the Divide by 3-10 module.

Divide by 3-10 – This module divides the frequency of a logic-level signal by integers 3 through 10. A 1-of-8 digitally addressable analog switch allows for selection of the division factor. A divide-by-2-through-10 counter (CD4018) operates on feedback to establish the division factor and is used in conjunction with a quad 2-input AND gate (CD4081). The header row and connector provide connections for ground, power (3-18V DC), counter clock input, address lines and inhibit control inputs for the 1-of-8 switch, and the subharmonic output.
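Functionally, the module behaves like this crude Python model (a simplification of the CD4018 feedback arrangement: one output pulse per n input rising edges, rather than the chip’s actual waveform):

```python
def divide_by_n(clock, n: int):
    """Emit one output pulse for every n positive-going edges of the
    input clock. `clock` is a list of logic levels (0/1)."""
    out, prev, count = [], 0, 0
    for level in clock:
        pulse = 0
        if prev == 0 and level == 1:   # count each rising edge
            count = (count + 1) % n
            pulse = 1 if count == 0 else 0
        out.append(pulse)
        prev = level
    return out
```

Patching the address lines of the 1-of-8 switch is what selects n on the real module.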

Prototype of the Rhythm Brain module.

Rhythm Brain – Three binary rate multipliers (CD4089) share a common clock input and output pulses at multiples (0-15) of 1/16th the frequency of the logic-level signal on the clock input. All chips share a common “set to 15” input, which globally resets the pattern. Each chip has independent 4-bit addressable rate multiplication and inhibit controls. The edge connector and header row provide connections for ground, power (3-18V), three independent 4-bit address selections of rate multiplication plus inhibit controls, and an individual output for each chip. An additional set of outputs provides the complement of the individual outputs on the header on the right side of the module.
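The rate-multiplier idea, m output pulses for every 16 input clocks, can be sketched with the classic accumulator formulation (an approximation: the real CD4089 uses internal gating, so its pulse placement within the cycle differs, but the pulse count per 16 clocks is the same):

```python
def rate_multiply(m: int, clocks: int = 16):
    """Model one binary rate multiplier channel: emit m pulses (0-15)
    per 16 input clocks, spread across the cycle via an accumulator."""
    acc, out = 0, []
    for _ in range(clocks):
        acc += m
        if acc >= 16:          # accumulator overflow -> output pulse
            acc -= 16
            out.append(1)
        else:
            out.append(0)
    return out
```

Running three of these with different m values against a shared clock gives the interlocking polyrhythms the module is built for.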

Prototype of the 3-bit Digitizer module.

3-bit Digitizer – An incoming analog voltage is digitized and quantized in real time at 3-bit resolution. Two quad op-amps (TL074) are used as comparators connected to a resistor network which sets 8 thresholds at equal intervals from 0 V to the supply voltage. An 8-bit priority encoder (CD4532) converts the comparator outputs to 3 bits. The edge connector and header row provide connections for ground, power (3-18V), the 3-bit output in order LSB to MSB, enable output, gate select output, and the 8 outputs of the comparators.

 


Thoughts On “Is Augmented Reality the Next Medium”

In the age of the ubiquitous internet, 24 hours is already a bit late to be posting a response to anything, but I had to be sure. There is rarely any time for reflection, and much of the content of our electronic media is reflex. These thoughts are on a recent opening and panel discussion at Eyebeam (Center for Art and Technology in Chelsea, NY) concerning the topic of Augmented Reality.

At about 6:30 I arrived at the panel, at which point the moderator, Laetitia Wolff, was finishing her introductory remarks. I caught enough to hear her point out that there are as many video cameras on earth as there are neurons in the human brain, connecting with the idea that this constitutes, or could constitute, a form of artificial intelligence equivalent to a human brain. Though intriguing, it’s admittedly a bit disturbing to dream of the possibilities of an intelligence formed from the interconnection of electronic eyes. With the announcement that the handsomely designed Google Glass will be made available this year (2013), one can’t help but wonder what it all could mean in the context of the emergence of a potentially new medium.

Augmented Reality (AR) serves to visually enhance objects, spaces or people with virtual content.  It has the potential to dramatically change the relationship between the physical and digital worlds. (Henchoz)

The above excerpt from the “Is Augmented Reality the Next Medium” curatorial statement was written by Nicolas Henchoz and provides a bit of context. A good part of the discussion was occupied by mentions of graphic overlays (projections and heads-up displays), physical objects with embedded information, and our mobile devices providing windows into new content. Enough material to start any dreamer’s head spinning.

But it wasn’t my imagination running wild with possibilities that made it hard to follow the particulars of the conversation. I was left wanting deeper insights, thirsty for critical dialog. I found myself asking questions which were never fully addressed in the discussion. A moment of relief came when Christiane Paul cautioned us to question this desire for further mediation that AR entails, but there was no real follow-up to this call to investigate what is staged, and to unmask theatricality.

It would seem that perhaps the most obvious question to address would be our ideas of reality and its relationship with the virtual.  A mention of Umberto Eco’s essay, “Travels in Hyperreality”, provided some insight.  Though not directly quoted by any of the panelists, here’s the paragraph referenced:

Constructing a full-scale model of the Oval Office (using the same materials, the same colors, but with everything obviously more polished, shinier, protected against deterioration) means that for historical information to be absorbed, it has to assume the aspect of a reincarnation. To speak of things that one wants to connote as real, these things must seem real. The “completely real” becomes identified with the “completely fake.” Absolute unreality is offered as real presence. The aim of the reconstructed Oval Office is to supply a “sign” that will then be forgotten as such: The sign aims to be the thing, to abolish the distinction of the reference, the mechanism of replacement. Not the image of the thing, but its plaster cast. Its double, in other words. (Eco)

It was pointed out that this instance of the Oval Office model served to illustrate a possible mode by which a simulation or replica functions. The reproduction, in the pursuit of realism, becomes hyperreal, standing in for the thing itself. Well beyond evoking a connection to the real, this form of realistic simulation becomes its own reality, and as such operates in its own unique way as a modifier of the potential experience of the real thing. Despite this, however, further insight into what additional theoretical frameworks we have for approaching the notions of the Real, reality, and the virtual failed to surface.

In building Augmented Reality, there is a dynamic between the physical object or environment, its simulation through electronic media, the mediated experience of an overlay of virtual content, and the ways in which the experience of one spills over into the other. Perhaps I yearned for some connection to the Lacanian theory of the Mirror Stage, but without a clear idea of how we formulate our notion of what we take to be the Real, and of the operation of the virtual within it, we stand little chance of understanding how this new reality will be used to control or influence perception. Granted, not every new technology is evil, but none is without unintended consequences. There’s going to be influence of some kind or another, and we have to be aware of how to look for it.

It’s incredible to imagine just how many computing devices are in the world, currently connected by various wireless networks, and how many of those have cameras of some sort. Though taken as a whole, can they possibly exhibit a human equivalent of intelligence? Are we able to formulate criteria by which we can assess the level of intelligence such a system might have? How does this equate to the level of intelligence of a single human, a small group, or the entire population?

When taken as a whole, the human species may be hardly more intelligent than slime mold. As we currently understand it, intelligence comes from the connectivity between elements and the plasticity of those connections. It’s not so much the structure itself, but the formation and revision of particular configurations. Sadly, the point missed by the panel is that our digitally mediated environment must be programmed, and until it can program itself, we must do it. The information we can put into it will be limited by what we ourselves can input and by the sophistication of the algorithms we write to automate that process. Here there are clear sources of structural bias and issues of access. Beyond that, there are also the issues of interface and content filtering.

Jonathan Lee of Google UXA rightly lists inputs and outputs as chief technical challenges faced by designers of user interface (UI) frameworks for Augmented Reality. There is no shortage of sensors today, and haptic interfaces allow for a wide variety of user control over content. It seems the problem is that there are almost too many inputs. The question then becomes a matter of managing the inputs, of extracting information from the input streams and storing it in a way that enhances virtual content and a user’s experience of navigating that content. Content- and context-aware algorithms solve this problem, but bring up other issues. Our experience of the internet is already highly mediated by content filtering algorithms. It can almost be argued that serendipity has been all but filtered out (they should make an app for that!) as individuals are catered to based on previously gathered information as interpreted by predictive algorithms (call for submissions: creative predictive algorithms). On the broader issue of adaptive algorithms and similar forms of artificial intelligence, one has to ask: what are the models for such algorithms? They must be programmed at some point, based upon some body of data. How do we select or craft the template? Is a possible consequence of further refining the intelligence of our algorithms a normative model for intelligence?

Perhaps it might seem as though I’ve come unhinged, but these questions become important when we begin to approach the task of embedding objects with information. What information or virtual content do we embed in these objects? Who has the ability to do the embedding? What are the possible system architectures that would allow the experience of an environment to actually be enhanced? What is the framework for approaching this issue of enhancement?

While you consider these, here’s some more of the curatorial statement:

The prospects of augmented reality are linked to a fundamental question: What makes the value of an object, its identity, our relationship with it?  The answer lies in the physical properties of the object, but also in its immaterial qualities, such as the story it evokes, the references with which it is connected, the questions it brings up.  For a long time, physical reality and immaterial values expressed themselves separately.  But with digital technology an object can express its story, reveal information, interact with its context and users in real time. (Henchoz)

It’s important not to mistake the map for the terrain. Physical objects are already vessels of their own history, as they are products of a particular sequence of events. Those events, though external and broad in scope, can be decoded, traced, and ultimately placed within a larger context of processes (not only physical ones but those with linkage to various cultural practices). With digital technology, an object will not express its story, but always that of someone else. To which we must ask: why that particular story? How did it find its embodiment as embedded data in that particular object? Is it a special case? Why does this story come to us and not others? If we open the system up for anyone to leave their story with any object, what do we do with hundreds of unique stories each told through a single object? What of a world filled with such objects? How do we navigate this terrain of embedded content? The information revealed by an object through media will, on the surface, only be what is placed there by the one privileged with the ability to do so. The nature of the interactions will be limited to those programmed by those privileged enough to do so, and the awareness equally limited.

The pieces in the exhibition did little to elaborate these deeper questions, or to complicate the view of reality that values the particular form of Augmented Reality put forward by Nicolas Henchoz. The lack of imagination here comes off as almost tongue-in-cheek. A microphone is placed before a drum kit rigged with mallets and drumsticks attached to actuators; by vocalizing into the microphone, guests can use their voices to control the kit. Mediation is dealt with as a translation or mapping of one kind of sound, through a chain of electronic and mechanical processes, into the production of another. Elsewhere in the exhibition space there is a flat, shallow museum display case without protective glass, in which various postcards, photos, notes, and objects have been placed. iPads are locked and tethered to the case, provided for guests to view the objects in the display through the camera in order to reveal additional virtual content in the form of animations or video, suggesting a sort of lived experience beyond the inert relics. In all there were seven pieces in the exhibition, of which two were not working after the panel discussion. Despite the technological foundations of the works presented, the whole exhibition space is filled with wide sheets of paper, gently curved around large cardboard tubes, evoking the sensation one might have of inhabiting a paper factory or newspaper printing facility.

There are two major paradigms within average digital, electronic and media art: “the funny mirror” and “demo mode”. The exhibition explored variations of these two paradigms to great effect, but with little affect. But it’s still unclear whether this was all to be taken seriously, or if the whole panel discussion and exhibition is actually an intensely subtle critique of current developments in AR. The partners and funders list for the whole affair doesn’t do much to shed light on the matter, except to indicate that there is a group of respectable people taking this all very seriously, whether as an emerging new technology with radical potential as a profoundly transformative medium or as a nuanced critique thereof.


Glitch Art Resources

I created a resource page for my class, “Doing it Wrong” at 3rd Ward.  The class is a short 3-hour Glitch Art techniques primer:

http://phillipstearns.wordpress.com/glitch-art-resources/

It is by no means complete. In fact, please contact me if you’d like for me to add something.  I’ll be updating this relatively frequently.


3D Printed Seashells

Listening to the Ocean on a Shore of Gypsum Sand is a collaborative project between Gene Kogan, Phillip Stearns, and Dan Tesene. Seashells are 3D printed from algorithmically generated forms for the sole purpose of listening to the “ocean”. The project questions the role of experience in the mediation of the virtual world to the real world and vice versa.

For those of us who have had the experience of listening to the sound of the ocean in actual seashells, it is a question of lived experience shaping an approach, not only to the object (or world) at hand, but to how it is perceived and acted upon. Are we to trust these shells? Do we seek out natural shells for comparison?

For those whose first experience of listening to the “ocean” comes through the digitally produced shell, the question becomes one of how the first encounter with a virtualized and simulated reality shapes the experience of lived space. This virtual shell is all I know of the real, until I encounter those found in nature—and when I see this natural shell, what then is my experience of it? More broadly, how does mediated reality form our preconceptions of the world?

For some, these questions seem obvious—we may even have convinced ourselves that we have this all figured out. We are aware of the possibility that the virtual world and real world are two interacting entities, distinct ideas that maintain their individuality despite their mutual influence on one another. There is, however, a possibility that this distinction is fading with younger generations, as technologically mediated experiences permeate childhood. I wonder about the effect of this as they grow into the world.

This project will be on view at Soundwalk 2012, a sound art festival in Long Beach, CA on September 1st 6-10pm.


Announcing Glitch Tapestries



Glitch Tapestries: Year of the Glitch Edition

Three all new weavings just came back from the mill: 36″ x 24″ tapestries.  Source images came from camera glitches featured in the Year of the Glitch posts 211, 198, and 192.

I’m offering these as rewards on the Glitch Textiles kickstarter campaign at the $225 level.


Inside the FujiFilm FinePix s9000

I was approached by Adam Ferriss, a Los Angeles based artist (check out his tumblr!), about some tips and tricks for circuit bending digital cameras.  His work with algorithmic image processing produces images that bear a striking resemblance to those produced by my prepared digital cameras.  The photography lab he runs at a college in LA was downsizing their inventory and getting rid of some antiquated FujiFilm FinePix s9000 cameras, and rather than throw them out, Adam decided to hang on to the lot and experiment with circuit bending them.

These things are beasts: fixed-zoom point-and-shoot cameras with the look and feel of a DSLR but without any of the manual controls and flexibility. No wonder they were getting rid of these things!

In exchange for a couple of the less functional cameras, I agreed to help Adam by documenting my deconstruction process. Bonus for you: now I’m publishing the documentation for public consumption.

Disclaimer: If you’re going to attempt to prepare/modify/circuit bend/disassemble any electronic device, be aware that you are placing yourself at risk of serious injury or death from electric shock; electronic devices may be irreversibly damaged or destroyed (for what it’s worth, it goes without saying that all warranties will be void); if any loss of property or injury occurs, it will be solely your responsibility.

Getting Started:

Before opening up the camera, there are a few items we need to have on hand.

  • Precision Screwdrivers
  • Spare batteries or external power supply (the s9000 uses a 5V supply or 4x AA batteries)
  • A bag or containers to place screws and other bits in
  • A notebook and camera for documenting
  • Anti-static wrist band

You’ll also need to do the following to prepare your camera.

  • Remove batteries
  • Remove the memory card(s)
  • Put on and ground the anti-static wristband

Now we can begin.

Removing the Screws:

Remove all exterior screws. As with all devices, there are screws in places you wouldn’t think to look. Start with the bottom, then move to the sides, open all compartments, and look for those hidden ones.

remove the bottom screws

remove screws hidden in the flash assembly

Once these are out of the way, you should be able to remove the assembly with the shutter release button and the power and other operation mode switches.  Be careful not to pull too hard, like I did, and pull the ribbon connector out of its socket.  Fortunately, mine didn’t tear, but you may not be so lucky!

removing the shutter release assembly

There are still a few screws to be removed before you can open the back panel of the camera.  Both of these were revealed by removing the shutter release assembly.  One is right next to the strap loop, and the other is just below the flash assembly.

screw next to strap loop

screw next to flash fitting

With these two screws out of the way, you should be able to gently coax the back panel off until you encounter some resistance from a couple of pairs of wires. The red and black wires running from the hot shoe attach to a board on the main body via a connector. There’s a speaker on two black wires that attaches to another part of the circuit board via a similar connector. Disconnect these two and the back panel should open like an oven door.

the speaker and hot shoe wires and connectors

wohoo! we’re in!

Now that we’ve partially disassembled the camera, and exposed some nice looking innards, we need to figure out if it still works. You can either use the AA batteries or a 5V power supply with a 4.0mm x 1.7mm connector. I used to do a lot of testing for Voltaic Systems and have one of their solar rechargeable V60 batteries around for powering my small electronics projects. Make sure that the main ribbon connectors from the back panel and the shutter release assembly are in place (the speaker and hot shoe wires don’t matter), then power up your camera and turn it on. (hint: check that the battery and memory card doors are closed!)

Yay it works!

What’s Inside: Poking About

Now that we have the camera partially disassembled and still working, we can have a look at some of the components inside to see where a good place to start bending would be. Upon first glance, you’ll notice that all the parts are SUPER tiny SMD. This is quite a letdown, but exactly what you can expect with more contemporary devices. In fact, if you’re opening up cameras released today, you’ll probably find that almost all the connections on the integrated circuits are actually underneath the chips and not via pins as with older-style ICs!

So what can we mess with? There’s an Analog Devices chip (AD9996) that I can’t seem to locate the datasheet for. There’s something similar to it, the AD9995, which is a 12-bit CCD signal processor. You’ll notice too that there’s a thick connector with lots of contacts. This is the CCD connection (go figure it’s so close to the signal processor).

The AD9996 12-bit CCD signal processor is in the center, with the CCD connector directly below.

I actually went a few steps further in deconstructing this camera and found that further disassembly made the system unstable. So, for now, you shouldn’t have to take the camera apart any further to tweak its brains.

How do I mess with it?

Since the pins and connections on this board are so tiny, I am hesitant to solder anything to it. One technique I always go to first is using a saliva moistened finger to poke the sensitive parts and see if anything happens. For capturing, you have two choices: movie or still. You can set the quality settings however you like. If you haven’t already inserted a memory card, now would be a good time.

Other strategies for altering the image include using a small probe to short-circuit adjacent pins on the CCD connector. I found that the right-hand side of the connector worked best, and that the series of little SMD ICs to the left of the AD9996 gave similar results. When I get in there with a soldering iron to draw out some of those points, I’ll be starting with the ICs and using very fine magnet wire.

So here are a few preliminary images.

I’ll be sharing more of my findings on my year-long glitch-a-day project, Year of the Glitch.


DCP Series Update

An animation created using a collection of stills produced during a session of circuit bending a Kodak DC215 1-megapixel digital camera.

Just uploaded 100+ new images to the DCP Series. This series is still well underway and is branching out into other cameras. Updates to the Olympus PIC series will be ready soon.

Be sure to check out the new additions to the Glitch Textiles project as well. There are currently nine blankets in the collection so far, each featuring a pattern woven directly from an image generated with the prepared cameras of the DCP Series and Year of the Glitch project.


Cameras of Year of the Glitch


Images and information about three cameras used in recent Year of the Glitch posts and the DCP Series are now public. The Kodak DC280, DC215, and DC200/210 have played a key role in my exploration of hardware- and software-based image generation/corruption. In the near future, these pages will be updated with more detailed information about specific techniques and circuits involved in creating the variety of images found in both the YOTG project and the DCP Series.

Cameras Featured


Kodak DC 280

Kodak DC 200/210

Kodak DC 215

Bernhard Garnicnig’s “Almost White”



Selections from Almost White by Bernhard Garnicnig (source: flickr – images taken between 29 Mar 2006 & 04 Aug 2010)

Photographs taken to set a camera’s white balance.

“Almost all cameras allow the user to set the photographic white point manually. To make this setting on some cameras, you have to shoot a picture of a usually white surface and set it as the white point reference.

These pictures, usually deleted right after confirming the setting, question the concept of subjective realities in the photographic process and document the photographers surroundings from his part unconscious, part mechanic eye. This is one of the last kinds of photography where no post processing is applied by a human while it shows how much the camera is manipulating the image already.

Its one of the last snapshots of photographic truth in the digital imaging age.” — B. Garnicnig
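The mechanism Garnicnig describes, shooting a white surface so the camera can derive its white point, amounts to computing per-channel gains from that reference frame. A simplified sketch of the idea (my own illustration, not any specific camera’s firmware):

```python
def white_balance_gains(reference_rgb):
    """Derive per-channel white-balance gains from the mean R, G, B of
    a shot of a white surface: scale each channel so they come out equal."""
    r, g, b = reference_rgb
    target = max(r, g, b)
    return (target / r, target / g, target / b)

def apply_gains(pixel, gains):
    """Apply the gains to one pixel, clipping to the 8-bit range."""
    return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))
```

The reference photo itself is only a means to these three numbers, which is precisely why it is “usually deleted right after confirming the setting.”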

So much of the world is left on the cutting room floor. This is a necessary part of relating experience, whether in the transmission of factual information or the telling of a story based on fact. Not every detail is needed in order to give a general picture or to convey the essence of an idea or experience, nor can every detail be captured, recalled, or communicated. Exactly what is omitted reveals the circumstances (bias) around and through which the material aspects of an experience are transformed by the process of relating and crafted into media.

Lived subjective experience is often filled or obscured by the mediated experience of information displayed or rendered through electronics, and today those scraps are increasingly difficult to find. Our culture of digitally mediated exchanges has been carefully structured to remove the perception of the framework through which we conduct our daily activities. The unwanted bits are tucked away on our hard drives or tossed in the recycling bin—in some cases, deleted in-camera. Everything is curated, edited, cleaned, and polished (even the raw webcam feed is a considered choice to convey honesty).

It’s not so much an issue of noise. The din of cellphone rings, tinny earbuds cranked way too high, the drone of our air handling systems and refrigeration units, the screeching, grinding and rumbling of our transportation machines: these we will not easily rid ourselves of. Rather, the dust that imposed itself on the grooves of a record, the grain of a piece of paper and the pen-in-hand overcoming it to scrawl a letter, the grit of static and dead air between the stations, are all disappearing.

It is not so much nostalgia as reflection: looking back on where we were but a few years ago, while understanding that today digital media strives for ever higher levels of fidelity (which, ironically, forces television personalities to pursue more extreme methods of altering their physical appearance, on top of the artificial sharpening and saturation applied to just about every image these days). That striving looks forward to a potential media future where everything is so radically fabricated and manipulated that there is no honesty, no substance, no reality left but a simulated phantom of what once was.

What is striking to me about Bernhard’s Almost White series is that it brings to the surface issues of the photographic medium, and how its digitization has been quietly accepted, wholesale, as Photography. The question of what makes photography different from digital imaging has been unearthed. By using these artifacts as evidence of the manipulation nearly every digital image undergoes, Bernhard opens the door to questioning the honesty behind every media image, even tempting us to ask exactly how staged these white balance calibration shots are. Has “fidelity” in the digital (dark) age become a matter of passing off artificially enhanced hyper-realism as reality? Or has it become something more subtle, staging reality in such a way that the extraordinary seems mundane?


Gallery

Glitch Textiles

Experimenting with making woven blankets out of images from Year of the Glitch.  Here are some photos of tests.  #32 is featured in them!

There are four blankets in this collection.  The first four images show two blankets made with a mechanized knitting process.  The last two images show two different blankets made using a Jacquard loom.

More on Glitch Textiles.


Year of the Glitch: A 366 Day Project for 2012

yearoftheglitch.tumblr.com

“56. What makes good glitch art good is that, amidst a seemingly endless flood of images, it maintains a sense of the wilderness within the computer.” — Hugh S. Manon and Daniel Temkin, “Notes on Glitch”

Year of the Glitch is a 366-day project exploring various manifestations of glitches (intentional and unintentional) produced by electronic systems.

Each day will bring a new image, video or sound file from a range of sources: prepared digital cameras, video capture devices, electronic displays, scanners, manipulated or corrupted files, skipping CDs, disrupted digital transmissions, etc.

These images are not of broken things, but the unlocking of other worlds latent in the technologies with which we surround ourselves.


Glitch Theory: “Notes on Glitch”

Jose Irion Neto, Untitled Databent JPEG-LS (2010)

In its 6th edition, titled “Wrong,” the online journal World Picture recently published an article, “Notes on Glitch,” by Hugh S. Manon and Daniel Temkin, with a companion gl1tchw0rks gall3ry curated by Temkin.

“Notes on Glitch” covers an impressive amount of ground, offering perspectives on well-known problematics of the newly emerging form of Glitch art, theorizing about issues of authenticity, effort, aesthetics, methodology, and materialism, as well as presenting some interesting trajectories for further thought.

The article is by no means comprehensive, nor does it claim to be, but it is a great resource for those interested in learning more about this growing phenomenon within electronic culture. I’m certainly excited about the conversations this piece of Glitch theory is sure to generate within the community and beyond.


Interpolation Studies

A pixel-level study of RAW-format interpolation algorithms acting on noise introduced by manually short-circuiting a digital camera. The specific models used in this group of images are the Canon G5 and Canon EOS Digital Rebel.


Prepared Olympus C-840L

A collection of images generated using a prepared Olympus C-840L 1.3 MP digital camera. The camera was a gift from Jeff Donaldson, purchased in Japan for 300 yen (about $3 USD).


Compression Study 01


With the opening of the Algorithmic Unconscious group show at Devotion Gallery earlier this month, my interest in iterative video processing as a method of exploring compression algorithms has been renewed.  You might be familiar with the technique: it is the same one used for the epic Alvin Lucier-inspired Video Room, in which YouTube user canzona uploaded, downloaded, and re-uploaded a video to YouTube 1,000 times.  Where his work explores the impact of the compression schemes native to YouTube, the new video work above explores the Motion JPEG-2000 compression algorithm.

The source video is a custom-made 16-second loop cycling through the 8 corner colors of the additive RGB color space: black, red, yellow, green, cyan, blue, magenta, white.  In QuickTime, the JPEG-2000 compression algorithm is chosen to export a .mov file at the lowest quality (smallest size).  At this setting, the compression algorithm repeatedly decides what information is relevant or important while discarding the rest, up to 99% of the original data.  The result is a considerably degraded reproduction of the original with visible data-compression artifacts.  By applying a handful of filters to the compressed file and then re-compressing, the data-compression artifacts are amplified.  Repeating this iterative, recursive process hundreds of times achieves an effect similar to feedback: the visual output degrades away from the original and the artifacts take on a generative nature.
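The recursive loop can be sketched in code. The study itself used QuickTime’s Motion JPEG-2000 export; the sketch below, a rough stand-in and not the actual pipeline, substitutes Pillow’s plain JPEG encoder as the lossy codec and a sharpening pass for the filters applied between generations:

```python
from io import BytesIO
from PIL import Image, ImageEnhance

# The 8 corner colors of the RGB cube used in the source loop.
COLORS = [(0, 0, 0), (255, 0, 0), (255, 255, 0), (0, 255, 0),
          (0, 255, 255), (0, 0, 255), (255, 0, 255), (255, 255, 255)]

def recompress(frame, iterations=50, quality=5):
    """Repeatedly encode and decode a frame at very low JPEG quality,
    sharpening between passes so the artifacts feed back on themselves."""
    for _ in range(iterations):
        frame = ImageEnhance.Sharpness(frame).enhance(2.0)  # amplify artifacts
        buf = BytesIO()
        frame.save(buf, format="JPEG", quality=quality)
        buf.seek(0)
        frame = Image.open(buf).convert("RGB")
    return frame

frame = Image.new("RGB", (64, 64), COLORS[1])  # one solid red frame of the loop
out = recompress(frame)
```

In a real run each frame of the 16-second loop would pass through the loop hundreds of times, and the codec’s block structure, not the source colors, becomes the dominant visual content.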

For this study, 193 iterations were time-compressed to fit within a roughly 10-minute span.  The video was then paired with audio from “Metamorphopsia,” a track from the Macular Degeneration project.

Leading up to this completed study, several attempts were made to work with H.264 on fades between black and white frames.  Similar work was done with audio compression algorithms and white noise.  Further works in this series will investigate the effects of different compression algorithms on simple patterns of varying motion, shapes, and transition effects.

As a note, this work is less about abstraction and more about taking the concepts of Concrete Art to a place where expression re-emerges through the algorithm, which I take to be an abstraction of human perceptual features mediated by a deterministic system of discrete logic.


Incomplete Darkness

Inspired by the recent appearance of lens-capped work by Jeff Donaldson, Incomplete Darkness is a new series of digital photographs utilizing sensor noise as the image source.


Motion Blur Photographs

A new collection of long-exposure digital photographs taken from moving vehicles.  This set, Bulb, was shot from a train heading into NYC.  Developing different sets of images from the DCP Series has inspired me to re-create some of the effects of digital artifacts using different techniques, favoring the manipulation of light and exposure time over directly manipulating the circuitry of digital imaging devices.  The next step for this series will be to switch over to film or the direct exposure of photographic paper.


No Input Mixer + Digitally Controlled Light + Scanner

Grayson Bagwell recently inspired me to begin working with prepared scanners.  After taking apart an HP F335 all-in-one printer/scanner combo, I got the bright idea to replicate some of my favorite op-art-esque images produced with the Kodak digital cameras in the DCP Series.  The image above was created by scanning a CFL bulb controlled by audio signals generated from mixer feedback.


Hacking the Logitech C270

Picked up a Logitech C270 HD 720p webcam on eBay for about $23.  While waiting for Hurricane Irene to arrive, I’ve been prodding about the innards, mostly the image sensor, looking for anything interesting…


Remnants: zip tie line drawings

Remnants_002 - Click for Flickr Set

Entity I, Fruiting Bodies of High-Voltage Transmission Lines, Alpha, Beta, Gamma, The Owls Are Not What They Seem—each of these projects involves the use of zip ties or cable ties to bind wires together, and once trimmed, these fragments are left behind as waste.  I have been saving these zip tie clippings for the past two years, the collection growing with each installation of the works mentioned above.  The collecting began when, at the end of a long night in the studio binding the wires for Entity I, I found the floor littered with hundreds of zip tie fragments.  I gathered them up but couldn’t bring myself to simply throw them out.  Something flashed through my mind—karmic guilt, perhaps.  Each one of these zip tie fragments is so much like a blade of grass, only these will last a thousand lifetimes.

The Remnants series is a collection of sketches: scans of configurations, some half-tossed randomly onto the surface of a scanner, others more deliberately positioned arrangements.


Photogenesis

Meditations on chemical and digital photographic processes

DCP_0022

In non-digital photography—the “capture” of images through exposure—a moment in time is sublimated into a successive process of chemical mediations.  These translations are obscured in the resulting photographic image, except to the skilled eye that can recognize certain chemical techniques of enhancement or manipulation.  The deception of the photographic image lies in the obfuscation of technique: texture is an illusion resulting from light playing off surfaces or through objects.  In painting, the technique, because it always produces a certain texture, becomes integral to the perception of the work and its content.  Perhaps it is because painting must transcend or reconcile with its deception, and because it is not simply an image, that it is distinguished from photographic image making: the subject or referent is simulated through the illusion of light created by the application of paint on a surface, where in photography it is a photo-chemical impression upon physical material, a literal play of light upon surfaces.  The digital images from the DCP Series complicate this issue of texture and technique.  They exhibit a richness in detail where the technique of manipulating the electronics of the camera asserts itself as simulated texture within the image, not in such a way as to reclaim the domain of texture occupied by painting, but to draw attention to the fact that the digital image itself is almost pure simulation, that there are many imperceptible layers of mediation involved in its production which remove it from its referent.

DCP_Series - Modified Kodak DC280

DCP_Series - Modified Kodak DC280

The referent in the DCP Series images is the process of digital photography revealed through intervening with the physical hardware during image capture. Here the illusion of texture arises from the play of data through algorithms; light, and therefore exposure, is amputated from the digital photographic process.  Where the mediations separating the real from the simulated within non-digital photography involve photo-chemical transformations of materials via exposure and development, the mediations involved in the creation of the digital images in the DCP Series involve complicated algorithms which are made visible through the intervention of wires intersecting processes by connecting points on the circuit boards which were never intended to meet.  Though the specifics of the tools and methods involved in both practices are radically different, because digital photography evolved from non-digital photography, there exists not only an overlap but a discontinuity between the two.  By scrutinizing work produced at the limits of each practice, and attempting to locate the essence of one within the other, the possibility of creating new forms arises.

Locating the analog of the physical process of manipulating the circuits of digital cameras within the photographic process poses an interesting set of problems.  That the image of film-based photography exists in a physical domain, while the image of the digital era exists as a data set corresponding to the charges stored in vast arrays of microscopic capacitors, already complicates any attempt to unite the domains of digital and photo-chemical image making.  The translation of light to a data set makes the digital camera an all-in-one image-making machine; you don’t need a photo lab to produce images.  Data acquisition and storage, data read-back and software interpretation of data, and output to the monitor replace the processes of exposing and developing film and then exposing and developing photographic paper.  Algorithms and silicon replace film, paper and chemical baths.

Parallels to the process of intervening in the electronics of the camera can be found in chemically processing unexposed film.  Created completely in the darkroom through the application of different chemicals directly on the film emulsion, the resulting images circumvent the need to expose film to light.  This raises the question of whether a photographic image requires the exposure of film at all, or whether its development takes precedence in the creation of photographic images.

Man Ray’s photograms alter our perception of the processes that define photography by discarding not only film, but the lens and the camera altogether.  By inserting physical objects between light and photographic paper to create images, the mechanism of the camera—the voyeur’s perspective onto the world—is circumvented.  In the digital domain, instead of adding objects to photographic paper, the addition of objects to the circuitry—alligator clips and wires—bypasses the camera’s inherent image-capturing capabilities.  However, because the process of modifying the cameras used in the DCP Series overrides the process of exposure, the Rayogram still falls short as a suitable analogy with which to locate the resultant digital images within the context of traditional photography.

Is it still possible to have a photograph without any of the mediums being exposed to light?

If images produced by developing unexposed, but chemically manipulated positive film or photographic paper (chemigram) can still fall under the umbrella of photography, then we have shifted the emphasis of photography from the subject, light, and exposure, to the chemical process of development which may not even involve light (except in the mediation of the electromagnetic forces responsible for chemical reactions).  To develop a single frame of unexposed (positive) film and/or an unexposed sheet of photographic paper would exemplify this process.  The question is now: where can we locate the notion of development within the practice of digital photography?

Inside the prepared digital camera, the element typically exposed to light in the production of an image, the CCD, is bypassed and the electronics responsible for interpreting its signals and writing them to a digital storage medium are manipulated to produce the image.  The process of data acquisition, processing and storage is akin to exposing film to light, and developing its negative.  When the data is read back, it is interpreted by decompression algorithms and presented on a screen.  With this software, the data set that describes the image can be manipulated using any number of mathematical operations.  This whole process of generating data and interpreting it as an image could of course be emulated within software, but the result would involve neither the mechanisms of exposure nor development in any traditional sense and thus the result could not be considered photographic.  Digital images produced within the camera occupy this interstitial zone between photography and algorithmically generated imagery, because the tools involved are designed to focus light, expose a surface and record the resulting data.  Perhaps by circumventing the process of exposure, the images produced by these prepared cameras cannot be considered photographic in any traditional sense.
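The software emulation mentioned above, generating data and interpreting it as an image, can be sketched directly. The helper below is hypothetical and illustrative, not the DCP process itself: it simply reads an arbitrary byte string as rows of 8-bit RGB pixels, the way a decoder reads back whatever has been “developed” onto the storage medium.

```python
import os
import numpy as np
from PIL import Image

def bytes_to_image(data, width=256):
    """Interpret an arbitrary byte string as rows of 8-bit RGB pixels.

    Any source of bytes will do: a corrupted file, a memory dump, or,
    as here, random noise standing in for redirected electrons.
    """
    usable = len(data) - len(data) % (width * 3)  # trim to whole pixel rows
    arr = np.frombuffer(data[:usable], dtype=np.uint8)
    return Image.fromarray(arr.reshape(-1, width, 3), mode="RGB")

img = bytes_to_image(os.urandom(256 * 3 * 64))  # 64 rows of pure noise
```

The sketch also shows where the emulation falls short of the prepared camera: here every step is deterministic software, whereas in the DCP Series the data is shaped by physical short-circuits that have no algorithmic equivalent.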

DCP_0055

It is still tempting to identify the process of creating these images with photography.  The shutter release is still involved; however, the act of initiating an exposure is abstracted, triggering a Rube Goldbergesque chain of pre-programmed instructions in which photons generate electrical signals that are quantized and stored as data points.  After the interventions employed in the production of the DCP Series, the digital camera thinks it is taking an exposure, but the paths from the CCD to the recording device have been severely compromised.  By bypassing the CCD electronics, we intercept the digital processes of “development”—analog-to-digital conversion, compression algorithms, etc.—and dump our redirected electrons onto what in film photography would have been the exposed and developed negative: the flash memory card.  It’s like taking a picture with the shutter mechanism disabled and afterwards bathing the film in a cocktail of different chemicals; you trigger the mechanics of an exposure, but what happens in the treatment of the “film” is what we’re concerned with.  You could almost discard the camera altogether, except that in the digital camera the translation of the image from CCD to storage medium—what would otherwise be from film to developed negative and then to photographic paper, etc.—depends upon a system of components and short-circuits that has no algorithmic equivalent; it escapes the type of emulation that would allow us to forget about the physical object altogether.

No doubt, this whole process is, in the end, digital, but perhaps there is hope that it is actually possible to contextualize it within the domain of photography and not simply relegate it to the domain of digital image production.  It may be that in preparing Polaroid cameras so that the film is physically damaged as it is pulled through the mechanisms, we find the closest parallels to the images in the DCP Series.

As a final note, this whole exercise of attempting to locate this work within the tradition of photography is necessary because the work is not based in emulation.  The act of using a digital camera locates the resulting image within the practice of photography.  The question, then, is whether altering the electronics of the camera is a photographic process; if so, does it have a precedent in previous photographic traditions, and in which specific stage of the whole process can we find the closest similarities?  Of course, I’m also interested in how this obscures the definition of photography—whether digital or film-based—and in whether other practices have touched upon this problem of “what is photography?”  So that traditionalists may understand the images and the process in terms of what they already know, we can refer back to those artists who chemically manipulate unexposed film and develop the results.  Though the analogy is not a perfect match, the form of photography discovered and exploited in the production of the DCP Series is the digital age’s answer to those artists.

See Also

Artists:
Pierre Cordier
Polli Marriner
Francoise André

Reading list:

Luis Nadeau, Encyclopedia of Printing, Photographic and Photomechanical Processes New Brunswick, NJ (Atelier Luis Nadeau), 1989, and the related website, photoconservation.com

Gordon Baldwin, Looking at Photographs: A Guide to Technical Terms Los Angeles and London (J. Paul Getty Museum in association with the British Museum Press), 1991

