Learn to build human-interactive Android apps, starting with device sensors
This book shows Android developers how to exploit the rich set of device sensors—locational, physical (temperature, pressure, light, acceleration, etc.), cameras, microphones, and speech recognition—to build fully human-interactive Android applications. Whether providing hands-free directions or checking your blood pressure, Professional Android Sensor Programming shows how to turn possibility into reality.
The authors provide techniques that bridge the gap between accessing sensors and putting them to meaningful use in real-world situations. They not only show you how to use the sensor-related APIs effectively, they also describe how to use supporting Android OS components to build complete systems. Along the way, they provide solutions to problems that commonly occur when using Android's sensors, with tested, real-world examples. Ultimately, this invaluable resource provides in-depth, runnable code examples that you can adapt for your own applications.
- Shows experienced Android developers how to exploit the rich set of Android smartphone sensors to build human-interactive Android apps
- Explores Android locational and physical sensors (including temperature, pressure, light, acceleration, etc.), as well as cameras, microphones, and speech recognition
- Helps programmers use the Android sensor APIs, use Android OS components to build complete systems, and solve common problems
- Includes detailed, functional code that you can adapt and use for your own applications
- Shows you how to successfully implement real-world solutions using each class of sensors for determining location, interpreting physical sensors, handling images and audio, and recognizing and acting on speech
Learn how to write programs for this fascinating aspect of mobile app development with Professional Android Sensor Programming.
Table of Contents
INTRODUCTION xxvii
PART I: LOCATION SERVICES
CHAPTER 1: INTRODUCING THE ANDROID LOCATION SERVICE 3
CHAPTER 2: DETERMINING A DEVICE’S CURRENT LOCATION 11
CHAPTER 3: TRACKING DEVICE MOVEMENT 27
CHAPTER 4: PROXIMITY ALERTS 45
PART II: INFERRING INFORMATION FROM PHYSICAL SENSORS
CHAPTER 5: OVERVIEW OF PHYSICAL SENSORS 65
CHAPTER 6: ERRORS AND SENSOR SIGNAL PROCESSING 103
CHAPTER 7: DETERMINING DEVICE ORIENTATION 121
CHAPTER 8: DETECTING MOVEMENT 147
CHAPTER 10: ANDROID OPEN ACCESSORY 189
PART III: SENSING THE AUGMENTED, PATTERN-RICH EXTERNAL WORLD
CHAPTER 11: NEAR FIELD COMMUNICATION (NFC) 219
CHAPTER 12: USING THE CAMERA 255
CHAPTER 13: IMAGE-PROCESSING TECHNIQUES 281
CHAPTER 14: USING THE MICROPHONE 303
PART IV: SPEAKING TO ANDROID
CHAPTER 15: DESIGNING A SPEECH-ENABLED APP 333
CHAPTER 16: USING SPEECH RECOGNITION AND TEXT-TO-SPEECH APIS 349
CHAPTER 17: MATCHING WHAT WAS SAID 407
CHAPTER 18: EXECUTING VOICE ACTIONS 441
CHAPTER 19: IMPLEMENTING SPEECH ACTIVATION 471
INDEX 495
About the Authors
Greg Milette is a professional Android developer and founder of Gradison Technologies, an app development company. He enjoys building practical apps like Digital Recipe Sidekick and contributing to Stack Overflow.
Adam Stroud is the lead developer for the Android version of Run Keeper. He is a self-proclaimed "phandroid" and is an active participant in the Android virtual community on Stack Overflow and Android Google Groups.