MyTunesController 1.3 available


What’s new:

  • Retina icons in the menu bar.
  • Holding the Option key while opening the menu reveals an “Update All Lyrics” menu item.
  • New “Update Current Lyrics” menu item for updating the lyrics of the current track.
  • More compact preferences window.
  • 4 new and improved plug-ins for fetching lyrics.

Also, I removed one plug-in that was prefixing lyrics with garbage text. At the moment I do not have access to enable auto-update; hopefully the GitHub folks can help me resolve it soon.

EDIT: auto-update is working now.

Requirements:

  • Intel, 64-bit processor
  • OS X 10.7 or later

Download:

Embedding hyperlinks in NSTextField

This is a follow-up to the post on embedding hyperlinks in a text view that I wrote a while back. HyperlinkTextField is a basic NSTextField subclass that internally uses an NSTextView to calculate the frames of the hyperlinks in its text, shows the pointing-hand cursor over them, and handles opening their URLs. It is meant to be used wherever you need to show non-selectable text containing clickable hyperlinks. The whole source code is available on GitHub under the MIT license. The project is a good starting point for implementing more complex text fields that show hyperlinks.
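The core idea can be sketched roughly like this (a hypothetical, simplified subclass, not the actual HyperlinkTextField implementation; the lazily created internal NSTextView and the point-to-link mapping are assumptions for illustration):

```objc
#import <Cocoa/Cocoa.h>

// Sketch: an internal NSTextView mirrors the field's attributed string so
// its layout manager can map a click location to a character index, from
// which the link attribute (if any) is read.
@interface MyHyperlinkTextField : NSTextField
@end

@implementation MyHyperlinkTextField
{
    NSTextView *_textView; // created lazily; mirrors the field's text and geometry
}

- (NSURL *)_linkAtPoint:(NSPoint)point
{
    if (_textView == nil)
        _textView = [[NSTextView alloc] initWithFrame:[self bounds]];
    [_textView setFrame:[self bounds]];
    [[_textView textStorage] setAttributedString:[self attributedStringValue]];

    NSUInteger index = [[_textView layoutManager] characterIndexForPoint:point
                                                         inTextContainer:[_textView textContainer]
                                fractionOfDistanceBetweenInsertionPoints:NULL];
    if (index >= [[self attributedStringValue] length])
        return nil;
    // Note: NSLinkAttributeName may hold an NSString or an NSURL in practice.
    return [[self attributedStringValue] attribute:NSLinkAttributeName
                                           atIndex:index
                                    effectiveRange:NULL];
}

- (void)mouseUp:(NSEvent *)event
{
    NSPoint point = [self convertPoint:[event locationInWindow] fromView:nil];
    NSURL *url = [self _linkAtPoint:point];
    if (url)
        [[NSWorkspace sharedWorkspace] openURL:url];
    else
        [super mouseUp:event];
}

@end
```

A real implementation would also add per-link cursor rects in -resetCursorRects: so the pointing-hand cursor only appears over the hyperlinks themselves.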

Download

NSTextFieldHyperlinks (github)

Capturing video from multiple devices simultaneously using QTKit

Basic video capture from a device consists of a few steps: finding a device, opening it, creating a capture session, and adding an input and an output to it. More detailed info is found here. The next code snippet shows how to start capturing video and display it in a QTCaptureView (ARC-compatible code).

QTCaptureDevice *captureDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
if (captureDevice)
{
    NSError *error = nil;
    if ([captureDevice open:&error])
    {
        QTCaptureDeviceInput *deviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:captureDevice];
        QTCaptureSession *captureSession = [[QTCaptureSession alloc] init];
        if ([captureSession addInput:deviceInput error:&error])
        {
            [self.captureView setCaptureSession:captureSession];
            [captureSession startRunning];
        }
        else
        {
            NSLog(@"%s Failed adding input device to session (device = %@, session = %@) with error (%@)", __func__, [captureDevice localizedDisplayName], captureSession, [error localizedDescription]);
        }
    }
    else
    {
        NSLog(@"%s Failed opening device (%@) with error (%@)", __func__, [captureDevice localizedDisplayName], [error localizedDescription]);
    }
}

In my sample application I use QTCaptureLayer instances to display the output of the capture sessions.

In CaptureViewController.m (ARC compatible code)

- (void)startCapturing
{
    self.capturing = YES;
    for (QTCaptureDevice *captureDevice in [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo])
    {
        NSError *error = nil;
        if ([captureDevice open:&error])
        {
            QTCaptureDeviceInput *deviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:captureDevice];
            QTCaptureSession *captureSession = [[QTCaptureSession alloc] init];
            if ([captureSession addInput:deviceInput error:&error])
            {
                QTCaptureLayer *sublayer = [QTCaptureLayer layerWithSession:captureSession];
                CGColorRef color = CGColorCreateGenericGray(0.8, 1.0);
                sublayer.backgroundColor = color;
                CGColorRelease(color);
                [[self.view layer] addSublayer:sublayer];
                [captureSession startRunning];
                [self _updatePixelBufferAttributesForSession:captureSession];
            }
            else
            {
                NSLog(@"%s Failed adding input device to session (device = %@, session = %@) with error (%@)", __func__, [captureDevice localizedDisplayName], captureSession, [error localizedDescription]);
            }
        }
        else
        {
            NSLog(@"%s Failed opening device (%@) with error (%@)", __func__, [captureDevice localizedDisplayName], [error localizedDescription]);
        }
    }
}

The trick to getting video from multiple devices simultaneously lies in the _updatePixelBufferAttributesForSession: method. The issue is that the USB bus has limited bandwidth, so with multiple devices it is impossible to get full-resolution video from all of them at once. The solution is to set the pixelBufferAttributes of the output.

- (void)_updatePixelBufferAttributesForSession:(QTCaptureSession *)session
{
    NSNumber *preferredHeight = [[NSUserDefaults standardUserDefaults] objectForKey:kLookoutPreferredVideoHeight];
    NSNumber *preferredWidth = [[NSUserDefaults standardUserDefaults] objectForKey:kLookoutPreferredVideoWidth];

    for (QTCaptureVideoPreviewOutput *output in [session outputs])
    {
        NSDictionary *attributes = [NSDictionary dictionaryWithObjectsAndKeys:preferredWidth, (id)kCVPixelBufferWidthKey, preferredHeight, (id)kCVPixelBufferHeightKey, nil];
        [output setPixelBufferAttributes:attributes];
    }
}

What happens then is that QTKit takes the preferred resolution and configures the device's output resolution for optimal performance, taking into account the resolution set via the pixelBufferAttributes. This means the actual resolution the device uses might differ from the one set in the pixelBufferAttributes.
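One way to see what the device actually delivers is to inspect frames in the preview output's delegate callback (a hypothetical delegate wiring shown for illustration; CVPixelBufferGetWidth/Height are standard Core Video calls):

```objc
// Sketch: delegate of a QTCaptureVideoPreviewOutput logging the resolution
// QTKit actually negotiated, which may differ from the values requested
// via pixelBufferAttributes.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    size_t width = CVPixelBufferGetWidth(videoFrame);
    size_t height = CVPixelBufferGetHeight(videoFrame);
    NSLog(@"Actual frame size: %zu x %zu", width, height);
}
```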

Download

Lookout (binary)

Lookout source (github)

MyTunesController 1.2 released

Version 1.2 is out the door. I added a plug-in based lyrics-fetching architecture, which makes it easy for me to implement new lyrics sources. When the lyrics window is opened, missing lyrics are fetched. There is also a new menu item, “Fetch All Lyrics”, which makes it easy to fetch all missing lyrics.

MacUpdate

github (direct download link)

I use Git, what about you?

Git? What is that?

I was an SVN user, but thanks to my work colleagues I have converted into a Git user. And there is no way back. In this post I am sharing two good webpages for Git beginners:

The first one has a list of useful commands, and the second one explains how to use branching when working on new features and the like.

Core animation layer layout manager

A simple example of how to create a layout manager for a CALayer. In this example, sublayers are laid out at random positions. The layout manager needs to implement -layoutSublayersOfLayer:.

- (void)layoutSublayersOfLayer:(CALayer *)layer
{
    CGRect sublayerFrame = CGRectZero;
    sublayerFrame.size = NSSizeToCGSize([[layer valueForKey:@"sublayerSize"] sizeValue]);

    // Shrink the bounds by the sublayer size so random origins always
    // keep sublayers fully inside the layer
    CGRect rect = layer.bounds;
    rect.size.width -= sublayerFrame.size.width;
    rect.size.height -= sublayerFrame.size.height;

    for (CALayer *imageLayer in [layer sublayers])
    {
        sublayerFrame.origin = [self _randomPointInRect:rect];
        imageLayer.frame = sublayerFrame;
    }
}

Setting layout manager to a layer:

MessedUpLayoutManager *layoutManager = [[MessedUpLayoutManager alloc] init];
layer.layoutManager = layoutManager;
[layoutManager release];
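Note that setting the layout manager does not lay anything out by itself; the layer has to be marked as needing layout. A quick sketch using standard CALayer calls:

```objc
[layer setNeedsLayout];   // schedules -layoutSublayersOfLayer: for the next transaction
[layer layoutIfNeeded];   // optionally forces the pending layout to run immediately
```

Layout is also triggered automatically in some cases, for example when the layer's bounds change or sublayers are added.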

Download

CALayout