I do not trust Dash by Kapeli

I removed Dash and will likely avoid apps written by Bogdan Popescu until I feel I can trust him again. I relied on Dash for quickly accessing documentation from multiple projects, all within a ubiquitous, unified, and polished interface. Sadly, I no longer trust the app or its developer and will seek alternative solutions for accessing documentation.

On October 5, Apple removed Dash from the App Store for fraudulent activity. I could not understand why a popular app would be involved with fraudulent reviews. Even after some of the oddities were explained in a statement by Bogdan, I feel we still lack the details that would allow me to trust him.

The tipping point in my decision not to trust Dash is that the app is not sandboxed. Although the App Sandbox would hamper Dash in some ways, I would feel much better knowing that Dash could not access my personal, private information without my explicit permission. While I do not believe Bogdan is a bad actor, even if my belief were wrong the App Sandbox would limit the damage such a bad actor could inflict.

I want to believe that Bogdan has only the best intentions. Applying the correct App Sandbox entitlements to Dash would restore my trust in the app and its developer.
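
For reference, sandboxing a Mac app comes down to signing it with an entitlements file that opts in to the App Sandbox. A minimal sketch of such a file follows; the network-client entitlement is my assumption of what Dash would need to download docsets, and a real app would likely need a few more.

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <!-- Opt in to the App Sandbox -->
        <key>com.apple.security.app-sandbox</key>
        <true/>
        <!-- Assumption: outgoing network access for downloading docsets -->
        <key>com.apple.security.network.client</key>
        <true/>
    </dict>
    </plist>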

Deep Learning at Apple

At the WWDC 2016 keynote, Craig Federighi showed the new “Advanced Computer Vision” capabilities that will be integrated into the Photos app. Facial recognition (previously only part of Photos for OS X, now macOS) is now supplemented with more general object and scene recognition. The end result is the ability not just to present a list of photos based on capture-time metadata (e.g. date, location), but to automatically organize and search your photos given a better understanding of the subjects and context of each shot.

Apple claims that the computer vision and deep learning algorithms all run on the user’s device in order to protect user privacy. In contrast, most competing solutions (e.g. Google Photos) run in the cloud.

How can these computationally complex algorithms run efficiently without draining the battery of mobile devices?

Enter the Basic Neural Network Subroutines (BNNS). As part of the Accelerate framework, Apple has added new APIs for efficiently running artificial neural networks! From the reference documentation:

A neural network is a sequence of layers, each layer performing a filter operation on its input and passing the result as input to the next layer. The output of the last layer is an inference drawn from the initial input: for example, the initial input might be an image and the inference might be that it’s an image of a dinosaur.

While the new APIs make it easy to run a neural network efficiently, the crucial trained weights must be provided by the user of the API.

BNNS supports implementation and operation of neural networks for inference, using input data previously derived from training. BNNS does not do training, however. Its purpose is to provide very high performance inference on already trained neural networks.
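
To get a feel for the API, here is a minimal sketch in C: a single fully connected layer with a ReLU activation, run once for inference. The weights and inputs are made-up numbers standing in for a trained model; compile with -framework Accelerate.

    #include <Accelerate/Accelerate.h>
    #include <stdio.h>

    int main(void) {
        // Hypothetical pretrained parameters for a 3-input, 2-output layer.
        float weights[6] = { 0.1f, -0.2f, 0.3f,
                             0.4f,  0.5f, -0.6f };  // row o holds W[o][i]
        float bias[2]    = { 0.01f, -0.02f };

        // Describe the input and output vectors.
        BNNSVectorDescriptor in_desc  = { .size = 3, .data_type = BNNSDataTypeFloat32 };
        BNNSVectorDescriptor out_desc = { .size = 2, .data_type = BNNSDataTypeFloat32 };

        // Describe the layer: y = ReLU(W x + b).
        BNNSFullyConnectedLayerParameters layer = {
            .in_size    = 3,
            .out_size   = 2,
            .weights    = { .data = weights, .data_type = BNNSDataTypeFloat32 },
            .bias       = { .data = bias,    .data_type = BNNSDataTypeFloat32 },
            .activation = { .function = BNNSActivationFunctionRectifiedLinear },
        };

        BNNSFilter filter = BNNSFilterCreateFullyConnectedLayer(&in_desc, &out_desc,
                                                                &layer, NULL);
        if (filter == NULL)
            return 1;

        float input[3]  = { 1.0f, 2.0f, 3.0f };
        float output[2] = { 0.0f, 0.0f };
        if (BNNSFilterApply(filter, input, output) == 0)
            printf("inference: %f %f\n", output[0], output[1]);

        BNNSFilterDestroy(filter);
        return 0;
    }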

It looks like WWDC Session 715 will cover the new neural network acceleration functions.

As for the new face, object, and scene recognition in Photos, where is Apple getting its training data?

I am comfortable with both 1- and 0-indexed array addressing. The importance of this cannot be overstated. I should make a prominent update to my resumé.

(For the uninitiated, both MATLAB and Julia are 1-indexed [i.e., array[1] is the first array element], while C-derived and many other languages begin array indexing at 0.)
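
A trivial illustration in C, with the 1-indexed equivalents noted in comments:

    #include <stdio.h>

    int main(void) {
        double x[3] = { 10.0, 20.0, 30.0 };

        // C is 0-indexed: x[0] is the first element, x[2] the last.
        printf("first = %g, last = %g\n", x[0], x[2]);

        // MATLAB/Julia are 1-indexed: x(1) in MATLAB or x[1] in Julia
        // is the first element, and x(3) / x[3] is the last.
        return 0;
    }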

My next topic will be tracking mmWave channels for the purpose of beamforming or precoding. The plan:

  1. Devise a dynamic channel model that captures variations in the channel parameters (e.g. angles of arrival and departure, path gain).
  2. Review tracking methods in the literature (e.g. Kalman filters; see the sketch after this list).
  3. Choose and describe the tracking method.
  4. Simulate the system with the chosen tracking method.
  5. Iterate and improve.
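
As a rough starting point for items 1 and 2, here is a toy sketch in C: a scalar Kalman filter tracking a single angle of arrival modeled as a random walk observed in noise. All of the constants are made up for illustration; a real mmWave channel model would track multiple coupled parameters (angles of arrival and departure, path gains) with a vector state.

    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>

    // Standard-normal samples via the Box-Muller transform.
    static double randn(void) {
        double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
        double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
        return sqrt(-2.0 * log(u1)) * cos(2.0 * M_PI * u2);
    }

    int main(void) {
        // Model: theta_k = theta_{k-1} + w_k  (random walk, variance q)
        //        z_k     = theta_k     + v_k  (noisy measurement, variance r)
        const double q = 1e-4;  // process noise variance (assumed)
        const double r = 1e-2;  // measurement noise variance (assumed)

        double theta = 0.5;     // true angle of arrival (radians)
        double est   = 0.0;     // filter estimate
        double p     = 1.0;     // estimate variance

        for (int k = 0; k < 50; k++) {
            // Simulate the channel: the angle drifts, then is observed in noise.
            theta += sqrt(q) * randn();
            double z = theta + sqrt(r) * randn();

            // Predict: the random-walk model leaves the estimate unchanged,
            // but its uncertainty grows by the process noise.
            p += q;

            // Update: blend prediction and measurement via the Kalman gain.
            double gain = p / (p + r);
            est += gain * (z - est);
            p   *= 1.0 - gain;

            printf("k=%2d  true=%.4f  meas=%.4f  est=%.4f\n", k, theta, z, est);
        }
        return 0;
    }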

OS X Hypervisor Framework

It looks like Apple has added system APIs for user-space hypervisors. The APIs might even be compliant with Mac App Store regulations.

Check out the Hypervisor Framework and vmnet Framework.

The xhyve project (started by Michael Steil) is a port of bhyve, the FreeBSD hypervisor, that works with these new APIs.
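
The API surface is refreshingly small and C based. Here is a minimal sketch of the setup and teardown calls (x86-64, compile with -framework Hypervisor, error handling mostly elided); a real hypervisor would load guest code into the mapped memory, initialize guest registers, and drive hv_vcpu_run() in a loop.

    #include <Hypervisor/hv.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define GUEST_MEM_SIZE 0x100000  /* 1 MiB of guest physical memory */

    int main(void) {
        // Create a VM associated with the current task.
        if (hv_vm_create(HV_VM_DEFAULT) != HV_SUCCESS) {
            fprintf(stderr, "hv_vm_create failed\n");
            return 1;
        }

        // Allocate page-aligned host memory and map it into the guest
        // physical address space at address 0.
        void *mem = valloc(GUEST_MEM_SIZE);
        hv_vm_map(mem, 0, GUEST_MEM_SIZE,
                  HV_MEMORY_READ | HV_MEMORY_WRITE | HV_MEMORY_EXEC);

        // Create a virtual CPU for the VM.
        hv_vcpuid_t vcpu;
        hv_vcpu_create(&vcpu, HV_VCPU_DEFAULT);

        // ... set up guest state and call hv_vcpu_run(vcpu) here ...

        // Tear down.
        hv_vcpu_destroy(vcpu);
        hv_vm_unmap(0, GUEST_MEM_SIZE);
        hv_vm_destroy();
        free(mem);
        return 0;
    }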

Unfortunately, use of the com.apple.vm.* entitlements seems to be restricted to Mac App Store apps. I have not seen any documentation on how to obtain the relevant entitlements.