Tag: iphone

Tips on how to set up, configure, initialize, and START/STOP Audio Units on the iPhone

Apple’s documentation on Audio Units and Remote I/O is sparse, and there are some subtle issues that can keep RENDER CALLBACKS from working properly.

Make sure not to set any callbacks until after the Remote I/O unit is fully initialized (i.e. after calling AudioUnitInitialize).
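For example, here is a minimal ordering sketch; unit is assumed to be an already-created Remote I/O instance and MyRenderCallback is a placeholder callback you supply:

AudioUnitInitialize(unit);                        // fully initialize first

// ...only then install the render callback
AURenderCallbackStruct cb = { MyRenderCallback, NULL };
AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                     kAudioUnitScope_Global, 0, &cb, sizeof(cb));

AudioOutputUnitStart(unit);                       // start rendering
// ...later, AudioOutputUnitStop(unit) to stop it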

An audio unit’s general operations are:
Open an audio unit (AudioComponentInstanceNew)
Configure it based on the context – AudioUnitSetProperty
Initialise the audio unit (AudioUnitInitialize)
– at this point the audio unit is in a state where it can render audio
Render audio (AudioUnitRender)
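As a rough sketch of the open/configure/initialize steps for the Remote I/O unit (error handling omitted; the sample rate and PCM format below are illustrative assumptions, not requirements):

#include <AudioUnit/AudioUnit.h>

AudioUnit CreateRemoteIOUnit(void)
{
    // 1. Open: find and instantiate the Remote I/O audio unit
    AudioComponentDescription desc = {0};
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit unit = NULL;
    AudioComponentInstanceNew(comp, &unit);

    // 2. Configure: describe the 16-bit stereo PCM the app will supply
    //    on input scope, bus 0 (i.e. what we feed it for play-out)
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate       = 44100.0;
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    asbd.mChannelsPerFrame = 2;
    asbd.mBitsPerChannel   = 16;
    asbd.mFramesPerPacket  = 1;
    asbd.mBytesPerFrame    = 4;   // 2 channels x 2 bytes
    asbd.mBytesPerPacket   = 4;
    AudioUnitSetProperty(unit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &asbd, sizeof(asbd));

    // 3. Initialise: the unit can now render audio
    AudioUnitInitialize(unit);
    return unit;
}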

An important part of rendering with an audio unit is manipulating the various controls the unit provides to change the rendered effect; for instance, changing the decay time of a reverb or the cutoff frequency of a filter.
These controls are called parameters, and AudioUnitGetParameter and AudioUnitSetParameter are used to interact with them.
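For example, a hedged sketch of reading and adjusting one parameter, here using the iOS multichannel mixer’s per-input volume (mixerUnit is an assumed, already-created mixer unit):

AudioUnitParameterValue volume = 0;
AudioUnitGetParameter(mixerUnit, kMultiChannelMixerParam_Volume,
                      kAudioUnitScope_Input, 0 /* input bus */, &volume);

// The last argument is a buffer offset in frames; 0 applies the change immediately.
AudioUnitSetParameter(mixerUnit, kMultiChannelMixerParam_Volume,
                      kAudioUnitScope_Input, 0, 0.5f, 0);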

If any reconfiguration of the audio unit is required, then:
Uninitialise the audio unit (AudioUnitUninitialize)
Configure it based on the context – AudioUnitSetProperty
Initialise the audio unit (AudioUnitInitialize)

Once the host is finished with an audio unit, it closes it:
Dispose audio unit (AudioComponentInstanceDispose)
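Putting the reconfiguration and tear-down steps together in a short sketch (unit and newFormat are assumed to exist from earlier):

AudioUnitUninitialize(unit);                    // back to the un-initialised state

AudioUnitSetProperty(unit, kAudioUnitProperty_StreamFormat,   // reconfigure as needed
                     kAudioUnitScope_Input, 0, &newFormat, sizeof(newFormat));

AudioUnitInitialize(unit);                      // ready to render again

// ...and once the host is completely finished with the unit:
AudioComponentInstanceDispose(unit);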

Audio units can be used programmatically (for instance, a mixer could be used to render audio for a game, or a generator
to play audio files), or they can be hosted in Digital Audio Workstation (DAW) applications such as Logic or GarageBand.
In the DAW case, it is common for an audio unit to provide a custom view to let the user interact with the potentially
complex DSP operations that the audio unit performs. The view is retrieved from an audio unit through AudioUnitGetProperty,
and the host then instantiates it (see <AudioUnit/AUCocoaUIView.h>).
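In the Mac-hosted DAW case, asking a unit for its view info might look roughly like this (a sketch only; loading the view bundle and calling its factory class is left out):

AudioUnitCocoaViewInfo viewInfo;
UInt32 size = sizeof(viewInfo);
OSStatus err = AudioUnitGetProperty(unit, kAudioUnitProperty_CocoaUI,
                                    kAudioUnitScope_Global, 0, &viewInfo, &size);
// On success, viewInfo.mCocoaAUViewBundleLocation is the view bundle's URL and
// viewInfo.mCocoaAUViewClass[0] names the factory class the host instantiates
// (see <AudioUnit/AUCocoaUIView.h>).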


iPhone Audio Programming Tips

After extensive research into Apple’s poorly documented Objective-C audio classes and C APIs, here are some helpful links.

Using RemoteIO audio unit

Decibel metering from an iPhone audio unit

Analyse Audio with RemoteIO

iPhone Core Audio tutorial

  1. http://timbolstad.com/2010/03/16/core-audio-getting-started-pt1/
  2. http://timbolstad.com/2010/03/16/core-audio-getting-started-pt2/
  3. http://timbolstad.com/2010/03/16/core-audio-getting-started-pt3/

It’s hard. Jens Alfke put it thusly:

“Easy” and “CoreAudio” can’t be used in the same sentence. :P CoreAudio is very powerful, very complex, and under-documented. Be prepared for a steep learning curve, APIs with millions of tiny little pieces, and puzzling things out from sample code rather than reading high-level documentation.

  • Media is hard because you’re dealing with issues of hardware I/O, real-time, threading, performance, and a pretty dense body of theory, all at the same time. Webapps are trite by comparison.

  • On the iPhone, Core Audio has three levels of opt-in for playback and recording, given your needs, listed here in increasing order of complexity/difficulty:

    1. AVAudioPlayer – File-based playback of DRM-free audio in Apple-supported codecs. Cocoa classes, called with Obj-C. iPhone 3.0 adds AVAudioRecorder (wasn’t sure if this was NDA, but it’s on the WWDC marketing page).
    2. Audio Queues – C-based API for buffered recording and playback of audio. Since you supply the samples, would work for a net radio player, and for your own formats and/or DRM/encryption schemes (decrypt in memory before handing off to the queue). Inherent latency due to the use of buffers.
    3. Audio Units – Low-level C-based API. Very low latency, as little as 29 milliseconds. Mixing, effects, near-direct access to input and output hardware.
  • Other important Core API’s not directly tied to playback and recording: Audio Session Services (for communicating your app’s audio needs to the system and defining interaction with things like background iPod player, ring/silent switch) as well as getting audio H/W metadata, Audio File Services for reading/writing files, Audio File Stream Services for dealing with audio data in a network stream, Audio Conversion Services for converting between PCM and compressed formats (and vice versa), Extended Audio File Services for combining file and conversion Services (e.g., given PCM, write out to a compressed AAC file).

  • Setting a property on an audio unit requires declaring the “scope” that the property applies to. Input scope is audio coming into the AU, output is going out of the unit, and global is for properties that affect the whole unit. So, if you set the stream format property on an AU’s input scope, you’re describing what you will supply to the AU.
  • Make the RemoteIO unit your friend. This is the AU that talks to both input and output hardware. Its use of buses is atypical and potentially confusing. Enjoy the ASCII art:


                                 -------------------------
                                 | i                   o |
    -- BUS 1 -- from mic --> | n    REMOTE I/O     u | -- BUS 1 -- to app -->
                                 | p      AUDIO        t |
    -- BUS 0 -- from app --> | u      UNIT         p | -- BUS 0 -- to speaker -->
                                 | t                   u |
                                 |                     t |
                                 -------------------------

    Ergo, the stream properties for this unit are


    Input scope,  bus 0: Set ASBD to indicate what you’re providing for play-out
    Input scope,  bus 1: Get ASBD to inspect audio format being received from H/W
    Output scope, bus 0: Get ASBD to inspect audio format being sent to H/W
    Output scope, bus 1: Set ASBD to indicate what format you want your units to receive
  • That said, setting up the callbacks for providing samples to or getting them from a unit takes global scope, as their purpose is implicit from the property names: kAudioOutputUnitProperty_SetInputCallback and kAudioUnitProperty_SetRenderCallback.
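Putting the bus/scope table and the two callback properties together, here is a hedged sketch of configuring RemoteIO (the stream format, bus numbers, and callback bodies are illustrative placeholders, not a drop-in implementation):

static OSStatus PlaybackCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber, UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    // Fill ioData with inNumberFrames frames of audio headed for the speaker (bus 0).
    return noErr;
}

static OSStatus RecordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber, UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
{
    // ioData is NULL here; pull the new mic samples yourself with
    // AudioUnitRender() on bus 1 into your own AudioBufferList.
    return noErr;
}

void ConfigureRemoteIO(AudioUnit ioUnit, AudioStreamBasicDescription asbd)
{
    // Recording (bus 1, input scope) is off by default; switch it on.
    UInt32 one = 1;
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &one, sizeof(one));

    // Input scope, bus 0: the format we will provide for play-out.
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &asbd, sizeof(asbd));

    // Output scope, bus 1: the format we want to receive from the mic.
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output, 1, &asbd, sizeof(asbd));

    // Render callback: RemoteIO pulls play-out samples from us.
    AURenderCallbackStruct renderCb = { PlaybackCallback, NULL };
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Global, 0, &renderCb, sizeof(renderCb));

    // Input callback: we get notified when fresh mic samples are available.
    AURenderCallbackStruct inputCb = { RecordingCallback, NULL };
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_SetInputCallback,
                         kAudioUnitScope_Global, 0, &inputCb, sizeof(inputCb));
}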


iPhone & iPod OS usage statistics (as of Dec 2009) & Android OS (as of June 2010)

Referenced from http://metrics.admob.com/2009/12/updated-iphone-os-stats/

Updated iPhone OS Stats

December 22nd, 2009

We received a couple requests recently for the distribution of the iPhone Operating System versions we see throughout our network.  The data below represents the percentage of worldwide traffic we saw from the iPhone and iPod touch from December 14-21st.

It shows that iPod touch users lag iPhone users in upgrading their OS; 97% of our iPhone traffic comes from 3.0 or higher, compared to only 68% of iPod touch traffic.

[Chart: iPhone OS version distribution]

[Chart: iPod touch OS version distribution]

For developers working on Android applications, platform version might become even more important than it is for the iPhone, given the wide variety of handsets and capabilities already available.  Fortunately, the Android team has announced a device dashboard based on devices that visit the Android Market.  Visit the Android developer blog for details, or the actual dashboard for the most recent data.


Platform Versions

This page provides data about the relative number of active devices running a given version of the Android platform. This can help you understand the landscape of device distribution and decide how to prioritize the development of your application features for the devices currently in the hands of users. For information about how to target your application to devices based on platform version, see API Levels.

Note: This data is based on the number of Android devices that have accessed Android Market within a 14-day period ending on the data collection date noted below.

Android Platform    Percent of Devices
Android 1.1          0.1%
Android 1.5         24.6%
Android 1.6         25.0%
Android 2.0          0.1%
Android 2.0.1        0.3%
Android 2.1         50.0%

Data collected during two weeks ending on June 16, 2010


OpenGL push/pop transformation matrix

For those familiar with 3D object manipulation, OpenGL uses a “transformation matrix stack” that lets you apply transforms to one part of your rendering without affecting the rest.  It operates like a software stack, but it only covers transformations such as translation, rotation, and scaling.  Too bad that iPhone OpenGL ES does not support push/pop of OpenGL attributes (to control line widths, etc.).

glPushMatrix();

// apply your glTranslatef(), glRotatef(), glScalef() calls and draw here
// note: the matrix stack does not save color; if you change it, restore it manually (e.g. back to 1,1,1,1)

glPopMatrix();

OpenGL ES quirks on iPhone

It’s great that the iPhone supports OpenGL, but there are a few things that are lacking (at least for what we wanted to do).

In this case, it was drawing vector art on the iPhone. We wanted to draw lines of varying widths, but iPhone does not support glPushAttrib and glPopAttrib.

Normally, if you want to change a lot of different pieces of GL state, such as GL_LIGHTING, glPolygonMode, glLineWidth, and the like, you would use the following code:

glPushAttrib(GL_ENABLE_BIT);
glPushAttrib(GL_LINE_BIT);
glPushAttrib(GL_POLYGON_BIT);

// ... change lighting, line, and polygon state and draw here ...

glPopAttrib();   // pops restore state in reverse order: polygon, line, enable
glPopAttrib();
glPopAttrib();

Below is from http://www.bradleymacomber.com/coderef/OpenGLES/ on some of the differences in the iPhone’s OpenGL ES.

OpenGL ES Limitations (on iPhone)

The subset of OpenGL for mobile devices is missing a lot of the typical functions. The exact details may come as a surprise. The Khronos site lacks any documentation explaining this. (Presumably this is an excuse for them to sell me a book.) So I am writing down the limitations as I find ’em. Most often the convention seems to be to eliminate functionality that is a convenient re-presentation of more fundamental low-level functionality.

No GLU library.
Some handy functions such as gluPerspective() and gluLookAt() will have to be replaced with manual calculations (see the sketch after this list).
No immediate-mode rendering.
This means there are no glBegin() and glEnd() functions. Instead you must use vertex arrays or vertex buffers. This is no surprise since games shouldn’t be using immediate mode anyway.
Simplified vertex arrays.
The glInterleavedArrays() function is unavailable; each array must be specified separately, although stride can still be used. There are no glDrawRangeElements(), glMultiDrawElements(), or glMultiDrawArrays() functions; instead use glDrawArrays() or glDrawElements(). There is also no glArrayElement() function, which makes sense since it requires glBegin() and glEnd().
No quads.
All the usual geometric primitives are supported except for GL_QUADS, GL_QUAD_STRIP, and GL_POLYGON. Of course, these are provided for convenience and are almost always easily replaced by triangles.
Smaller datatypes.
Many functions accept only smaller datatypes such as GL_SHORT instead of GL_INT, or GL_FLOAT instead of GL_DOUBLE. Presumably this is to save space on a device with limited memory and a screen small enough that a lack of fine detail can go unnoticed.
GLfixed.
This new low-level datatype is introduced to replace a variety of datatypes normally presented. For example, there is no glColor4ub() function. Presumably this helps support devices which do not have a FPU.
No glPushAttrib nor glPopAttrib (nor glPushClientAttrib).
Well, this is annoying. I guess an iPhone application is supposed to be simple enough that we can keep track of all states manually, eh?
No GL_LINE_SMOOTH.
Enabling it has no effect.
No display lists.
Instead use vertex arrays or vertex buffers.
No bitmap functions.
Functions such as glBitmap() and glRasterPos*() do not exist. This means you cannot render simple bitmap fonts. Instead, textured quads must be rendered. Of course, you could always render vector fonts. Don’t let me stop you.
Texture borders not supported.
Probably not a big deal.
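As mentioned in the first item above, here is a sketch (assuming OpenGL ES 1.x on the iPhone) of replacing gluPerspective() with a manual frustum calculation:

#include <math.h>
#include <OpenGLES/ES1/gl.h>

// Equivalent of gluPerspective(fovy, aspect, zNear, zFar) for ES 1.x
static void PerspectiveES(GLfloat fovyDegrees, GLfloat aspect,
                          GLfloat zNear, GLfloat zFar)
{
    // Half-height of the near clipping plane from the vertical field of view
    GLfloat top   = zNear * tanf(fovyDegrees * (GLfloat)M_PI / 360.0f);
    GLfloat right = top * aspect;

    glMatrixMode(GL_PROJECTION);
    glFrustumf(-right, right, -top, top, zNear, zFar);
}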

Guess that means I have to manually track some of these changes and revert them when needed.  This is not always easy to do when other parts of your program can access OpenGL without your knowledge.
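One way to do that tracking by hand, sketched here for line width and lighting only (not a general replacement for glPushAttrib):

GLfloat   savedLineWidth = 1.0f;
GLboolean lightingWasOn  = glIsEnabled(GL_LIGHTING);
glGetFloatv(GL_LINE_WIDTH, &savedLineWidth);

glDisable(GL_LIGHTING);
glLineWidth(3.0f);
// ... draw the vector art ...

// Restore the state we touched
glLineWidth(savedLineWidth);
if (lightingWasOn) glEnable(GL_LIGHTING);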


Starting to develop iPhone apps

Building iPhone apps is no small feat.  You have to be a good C/C++ programmer, learn the Objective-C way of programming, and then learn the Cocoa frameworks.

But it’s great that a lot of people have created sample code and examples that showcase functionality.


Copyright 2009-2010 ZeroInverse.com