iOS User Guide for Document Scanner Integration

System Requirements

  • Supported OS: iOS 11 or higher (iOS 13 and higher recommended).
  • Supported ABI: arm64 and x86_64.
  • Development Environment: Xcode 13 and above (Xcode 14.1+ recommended).

Add the SDK

There are two ways to add the SDK to your project: via CocoaPods or via Swift Package Manager.

Add the xcframeworks via CocoaPods

  1. Add the frameworks to your Podfile.

    target 'HelloWorld' do
       use_frameworks!

       pod 'DynamsoftCaptureVisionBundle', '2.6.1000'

    end
    
  2. Execute the pod command to install the frameworks and generate the workspace (HelloWorld.xcworkspace):

    pod install
    

Add the xcframeworks via Swift Package Manager

  1. In your Xcode project, go to File > Add Packages….

  2. In the top-right section of the window, search “https://github.com/Dynamsoft/capture-vision-spm”

  3. Select capture-vision-spm, choose Exact version, enter 2.6.1000, then click Add Package.

  4. Check all the frameworks and add them to your target. (If you manage dependencies through a Package.swift manifest instead of the Xcode UI, see the sketch below.)
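
If you declare dependencies in a Package.swift manifest rather than through the Xcode UI, the same package can be pinned there. The following is a minimal sketch, assuming Swift tools 5.6 or later; the product name used in the target dependency is an assumption, so select whichever frameworks the package actually exposes once it resolves.

    // swift-tools-version:5.6
    import PackageDescription

    let package = Package(
       name: "HelloWorld",
       dependencies: [
          // Pin the same version the guide installs through the Xcode UI.
          .package(url: "https://github.com/Dynamsoft/capture-vision-spm", exact: "2.6.1000")
       ],
       targets: [
          .target(
             name: "HelloWorld",
             dependencies: [
                // Product name is an assumption; check the package manifest for the exact names.
                .product(name: "DynamsoftCaptureVisionBundle", package: "capture-vision-spm")
             ]
          )
       ]
    )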

 

Build Your First Application

This guide will walk you through the process of creating a HelloWorld app for detecting and normalizing documents via a camera video input.

Note:

  • Xcode 14.0 is used in this guide.
  • You can get the source code of the HelloWorld app from the following link.

Create a New Project

  1. Open Xcode and select Create a new Xcode project.

  2. Select iOS -> App for your application.

  3. Enter your product name (HelloWorld), and select the interface (Storyboard) and language (Objective-C/Swift).

  4. Click on the Next button and select the location to save the project.

  5. Click on the Create button to finish.

Include the Library

Add the SDK to your new project. Please read the Add the SDK section above for more details.

Initialize License

Initialize the license before using any other features of the SDK. In your ViewController file, add the following code to initialize the license.

  • Objective-C
  • Swift
  1. // Import the DynamsoftLicense module to init license
    #import <DynamsoftLicense/DynamsoftLicense.h>
    // Add LicenseVerificationListener to the interface
    @interface ViewController ()<DSLicenseVerificationListener>
    - (void)setLicense{
       [DSLicenseManager initLicense:@"DLS2eyJvcmdhbml6YXRpb25JRCI6IjIwMDAwMSJ9" verificationDelegate:self];
    }
    -(void)onLicenseVerified:(BOOL)isSuccess error:(NSError *)error
    {
       NSLog(@"On License Verified");
       if (!isSuccess)
       {
          NSLog(@"%@", error.localizedDescription);
       }else
       {
          NSLog(@"License approved");
       }
    }
    ...
    @end
    
  2. // Import the DynamsoftLicense module to init license
    import DynamsoftLicense
    // Add LicenseVerificationListener to the interface
    class ViewController: UIViewController, LicenseVerificationListener {
       func setLicense(){
        LicenseManager.initLicense("DLS2eyJvcmdhbml6YXRpb25JRCI6IjIwMDAwMSJ9", verificationDelegate: self)
       }
       func onLicenseVerified(_ isSuccess: Bool, error: Error?) {
          // Add your code to do when license server returns.
          if let error = error, !isSuccess{
             print(error.localizedDescription)
          }
       }
       ...
    }
    

Note:

  • Network connection is required for the license to work.
  • The license string here will grant you a time-limited trial license.
  • You can request a 30-day trial license via the Request a Trial License link

 

Main ViewController for Realtime Detection of Quads

In the main view controller, your app will scan documents via video streaming and display the detected quadrilateral area on the screen. First, import the headers in the ViewController file.

  • Objective-C
  • Swift
  1.    #import <DynamsoftCore/DynamsoftCore.h>
       #import <DynamsoftDocumentNormalizer/DynamsoftDocumentNormalizer.h>
       #import <DynamsoftCaptureVisionRouter/DynamsoftCaptureVisionRouter.h>
       #import <DynamsoftUtility/DynamsoftUtility.h>
       #import <DynamsoftCameraEnhancer/DynamsoftCameraEnhancer.h>
    
  2.    import DynamsoftCore
       import DynamsoftCaptureVisionRouter
       import DynamsoftDocumentNormalizer
       import DynamsoftUtility
       import DynamsoftCameraEnhancer
    

Get Prepared with the Camera Module

Create instances of CameraEnhancer and CameraView.

  • Objective-C
  • Swift
  1. @property (nonatomic, strong) DSCameraEnhancer *dce;
    @property (nonatomic, strong) DSCameraView *cameraView;
    ...
    - (void)setUpCamera
    {
       _cameraView = [[DSCameraView alloc] initWithFrame:self.view.bounds];
       // Keep the camera view sized to the current view when the layout changes.
       [_cameraView setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
       [self.view addSubview:_cameraView];
       [_cameraView addSubview:_captureButton];
       // Bind the camera enhancer with the camera view.
       _dce = [[DSCameraEnhancer alloc] init];
       [_dce setCameraView:_cameraView];
       // Additional step: Highlight the detected document boundary.
       DSDrawingLayer *layer = [_cameraView getDrawingLayer:DSDrawingLayerIdDDN];
       [layer setVisible:YES];
       // You can enable the frame filter feature of Dynamsoft Camera Enhancer.
       //[_dce enableEnhancedFeatures:DSEnhancerFeatureFrameFilter];
    }
    
  2. var cameraView:CameraView!
    var dce:CameraEnhancer!
    ...
    func setUpCamera() {
       // Create a camera view and add it as a sub view of the current view.
       cameraView = .init(frame: view.bounds)
       cameraView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
       view.insertSubview(cameraView, at: 0)
       // Bind the camera enhancer with the camera view.
       dce = CameraEnhancer()
       dce.cameraView = cameraView
       // Additional step: Highlight the detected document boundary.
       let layer = cameraView.getDrawingLayer(DrawingLayerId.DDN.rawValue)
       layer?.visible = true
       // You can enable the frame filter feature of Dynamsoft Camera Enhancer.
       // dce.enableEnhancedFeatures(.frameFilter)
    }
    

Initialize Capture Vision Router

Once the camera component is set up, declare and create an instance of CaptureVisionRouter and set its input to the Camera Enhancer object you created in the last step.

  • Objective-C
  • Swift
  1. @property (nonatomic, strong) DSCaptureVisionRouter *cvr;
    ...
    - (void)setUpCvr
    {
       _cvr = [[DSCaptureVisionRouter alloc] init];
    }
    
  2. var cvr:CaptureVisionRouter!
    func setUpCvr() {
       cvr = CaptureVisionRouter()
    }
    

Bind your CaptureVisionRouter instance with the created CameraEnhancer instance.

  • Objective-C
  • Swift
  1. - (void)setUpCvr
    {
       ...
       NSError *cvrError;
       [_cvr setInput:_dce error:&cvrError];
    }
    
  2. func setUpCvr() {
       try? cvr.setInput(dce)
    }
    

Set up Result Receiver

  1. Add CapturedResultReceiver to your ViewController.

    • Objective-C
    • Swift
    1. @interface ViewController ()<DSLicenseVerificationListener, DSCapturedResultReceiver>
      
    2. class ViewController: UIViewController, LicenseVerificationListener, CapturedResultReceiver {
         ...
      }
      
  2. Implement the onNormalizedImagesReceived method to receive the normalized images as captured results.

    • Objective-C
    • Swift
    1. // import ImageViewController.h. It will be implemented later.
      #import "ImageViewController.h"
      -(void)onNormalizedImagesReceived:(DSNormalizedImagesResult *)result
      {
         if (result!=nil && result.items[0].imageData!=nil && (_implementCapture || result.items[0].crossVerificationStatus == DSCrossVerificationStatusPassed))
         {
            NSLog(@"Capture confirmed");
            _implementCapture = false;
            dispatch_async(dispatch_get_main_queue(), ^{
               [self.cvr stopCapturing];
               ImageViewController *imageViewController = [[ImageViewController alloc] init];
               NSError * error;
               imageViewController.normalizedImage = [result.items[0].imageData toUIImage:&error];
               NSLog(@"UIImage set");
               [self.navigationController pushViewController:imageViewController animated:YES];
            });
         }
      }
      
    2. func onNormalizedImagesReceived(_ result: NormalizedImagesResult) {
         if let item = result.items?.first {
            if item.crossVerificationStatus == .passed || implementCapture {
               guard let data = item.imageData else { return }
               implementCapture = false
               cvr.stopCapturing()
               DispatchQueue.main.async {
                      let resultView = ImageViewController()
                      resultView.normalizedImage = try? data.toUIImage()
                      self.navigationController?.pushViewController(resultView, animated: true)
               }
            }
         }
      }
      
  3. Add the result receiver to the CaptureVisionRouter.

    • Objective-C
    • Swift
    1. - (void)setUpCvr
      {
         ...
         [_cvr addResultReceiver:self];
         DSMultiFrameResultCrossFilter *filter = [[DSMultiFrameResultCrossFilter alloc] init];
         [filter enableResultCrossVerification:DSCapturedResultItemTypeNormalizedImage isEnabled:true];
         [_cvr addResultFilter:filter];
      }
      
    2. func setUpCvr() {
         ...
         cvr.addResultReceiver(self)
         let filter = MultiFrameResultCrossFilter.init()
         filter.enableResultCrossVerification(.normalizedImage, isEnabled: true)
         cvr.addResultFilter(filter)
      }
      
  4. Add a Capture button that lets the user confirm the result.

    • Objective-C
    • Swift
    1. @property (nonatomic, strong) UIButton *captureButton;
      @property (nonatomic) BOOL implementCapture;
      ...
      - (void)addCaptureButton {
         [self.view addSubview:self.captureButton];
      }
      - (UIButton *)captureButton {
         NSLog(@"Start adding button");
         CGFloat screenWidth = [UIScreen mainScreen].bounds.size.width;
         CGFloat screenHeight = [UIScreen mainScreen].bounds.size.height;
         if (!_captureButton) {
            _captureButton = [UIButton buttonWithType:UIButtonTypeCustom];
            _captureButton.frame = CGRectMake((screenWidth - 150) / 2.0, screenHeight - 100, 150, 50);
            _captureButton.backgroundColor = [UIColor grayColor];
            _captureButton.layer.cornerRadius = 10;
            _captureButton.layer.borderColor = [UIColor darkGrayColor].CGColor;
            [_captureButton setTitle:@"Capture" forState:UIControlStateNormal];
            [_captureButton setTitleColor:[UIColor whiteColor] forState:UIControlStateNormal];
            [_captureButton addTarget:self action:@selector(setCapture) forControlEvents:UIControlEventTouchUpInside];
         }
         return _captureButton;
      }
      - (void)setCapture
      {
         _implementCapture = true;
      }
      
    2. var captureButton:UIButton!
      var implementCapture:Bool = false
      ...
      func addCaptureButton()
      {
         let w = UIScreen.main.bounds.size.width
         let h = UIScreen.main.bounds.size.height
         let SafeAreaBottomHeight: CGFloat = UIApplication.shared.statusBarFrame.size.height > 20 ? 34 : 0
         let photoButton = UIButton(frame: CGRect(x: w / 2 - 60, y: h - 100 - SafeAreaBottomHeight, width: 120, height: 60))
         photoButton.setTitle("Capture", for: .normal)
         photoButton.backgroundColor = UIColor.green
         photoButton.addTarget(self, action: #selector(confirmCapture), for: .touchUpInside)
         DispatchQueue.main.async(execute: { [self] in
            view.addSubview(photoButton)
         })
      }
      @objc func confirmCapture()
      {
         implementCapture = true
      }
      

Configure the methods viewDidLoad, viewWillAppear, and viewWillDisappear

  • Objective-C
  • Swift
  1. - (void)viewDidLoad {
       [super viewDidLoad];
       [self setLicense];
       [self setUpCamera];
       [self setUpCvr];
       [self addCaptureButton];
    }
    - (void)viewWillAppear:(BOOL)animated
    {
       [super viewWillAppear:animated];
       [_dce open];
       [_cvr startCapturing:DSPresetTemplateDetectAndNormalizeDocument completionHandler:^(BOOL isSuccess, NSError * _Nullable error) {
          if (!isSuccess && error != nil) {
            NSLog(@"%@", error.localizedDescription);
          }
       }];
    }
    - (void)viewWillDisappear:(BOOL)animated
    {
       [super viewWillDisappear:animated];
       [_dce close];
    }
    
  2. override func viewDidLoad() {
       super.viewDidLoad()
       setLicense()
       setUpCamera()
       setUpCvr()
       addCaptureButton()
    }
     override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        dce.open()
        cvr.startCapturing(PresetTemplate.detectAndNormalizeDocument.rawValue) { isSuccess, error in
           if let error = error, !isSuccess {
              print("Capture start failed")
              print(error.localizedDescription)
           }
        }
     }
    override func viewWillDisappear(_ animated: Bool) {
       super.viewWillDisappear(animated)
       dce.close()
    }
    

Display the Normalized Image

  1. Create a new UIViewController class ImageViewController.

  2. Add a property normalizedImage to the header file of ImageViewController (Objective-C only).

    #import <UIKit/UIKit.h>
    
    NS_ASSUME_NONNULL_BEGIN
    
    @interface ImageViewController : UIViewController
    
    @property (nonatomic, strong) UIImage *normalizedImage;
    
    @end
    
    NS_ASSUME_NONNULL_END
    
  3. Configure the ImageViewController to display the normalized image.

    • Objective-C
    • Swift
    1. #import "ImageViewController.h"
      @interface ImageViewController()
      @property (nonatomic, strong) UIImageView *imageView;
      @end
      @implementation ImageViewController
      -(void)viewDidLoad
      {
         NSLog(@"ImageViewController loaded");
         [super viewDidLoad];
         [self setUpView];
      }
      - (void)setUpView
      {
         _imageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
         [self.view addSubview:_imageView];
         [_imageView setContentMode:UIViewContentModeScaleAspectFit];
         dispatch_async(dispatch_get_main_queue(), ^{
            [self.imageView setImage:self.normalizedImage];
         });
      }
      @end
      
    2. import UIKit
      import DynamsoftCore
      import DynamsoftCaptureVisionRouter
      import DynamsoftDocumentNormalizer
      class ImageViewController: UIViewController {
         var normalizedImage:UIImage!
         var imageView:UIImageView!
         override func viewDidLoad() {
            super.viewDidLoad()
            setUpView()
         }
         func setUpView() {
            imageView = UIImageView.init(frame: view.bounds)
            imageView.contentMode = .scaleAspectFit
            view.addSubview(imageView)
            DispatchQueue.main.async { [self] in
               imageView.image = normalizedImage
            }
         }
      }
      
  4. Go to your Main.storyboard and embed the main ViewController in a Navigation Controller so that the result view can be pushed onto the navigation stack. A programmatic alternative is sketched below.
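
If you prefer to set this up in code rather than in the storyboard, the following is a minimal sketch that embeds the guide's ViewController in a UINavigationController from the SceneDelegate. It assumes the default scene-based lifecycle on iOS 13+ and is not part of the original sample.

    import UIKit

    class SceneDelegate: UIResponder, UIWindowSceneDelegate {
       var window: UIWindow?

       func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
                  options connectionOptions: UIScene.ConnectionOptions) {
          guard let windowScene = scene as? UIWindowScene else { return }
          let window = UIWindow(windowScene: windowScene)
          // Embed the main ViewController in a navigation controller so that
          // pushViewController(_:animated:) works when a result is captured.
          window.rootViewController = UINavigationController(rootViewController: ViewController())
          window.makeKeyAndVisible()
          self.window = window
       }
    }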

 

Configure Camera Permissions

Add Privacy - Camera Usage Description (the raw key is NSCameraUsageDescription) to the Info.plist of your project to request camera permission. An easy way to do this is to open your project settings, select the Info tab, and add this Privacy property to the iOS target properties list.
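
The Info.plist entry lets the system show the permission prompt automatically the first time the camera starts. If you also want to check the authorization state yourself before opening the camera, a minimal Swift sketch using AVFoundation could look like the following; it is optional, not required by the SDK, and the helper name is hypothetical.

    import AVFoundation

    // Optional runtime check; the Info.plist key is still required.
    func requestCameraAccessIfNeeded(_ completion: @escaping (Bool) -> Void) {
       switch AVCaptureDevice.authorizationStatus(for: .video) {
       case .authorized:
          completion(true)
       case .notDetermined:
          AVCaptureDevice.requestAccess(for: .video) { granted in
             DispatchQueue.main.async { completion(granted) }
          }
       default:
          // .denied or .restricted: direct the user to Settings if needed.
          completion(false)
       }
    }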

Additional Steps for iOS 12.x or Lower Versions

If your iOS version is 12.x or lower, please add the following additional steps:

  1. Remove the methods application:didDiscardSceneSessions: and application:configurationForConnectingSceneSession:options: from your AppDelegate file.
  2. Remove the SceneDelegate.swift file (SceneDelegate.h & SceneDelegate.m for Objective-C).
  3. Remove the Application Scene Manifest entry from your Info.plist file.
  4. Declare the window in your AppDelegate.swift file (AppDelegate.h for Objective-C).

    • Objective-C
    • Swift
    1. @interface AppDelegate : UIResponder <UIApplicationDelegate>
      @property (strong, nonatomic) UIWindow *window;
      @end
      
    2. import UIKit
      @main
      class AppDelegate: UIResponder, UIApplicationDelegate {
         var window: UIWindow?
      }
      

Build and Run the Project

  1. Select the device that you want to run your app on. A physical device is recommended, since the iOS Simulator does not provide camera input.
  2. Run the project; the app will be built and installed on your device.

Note:

  • You can get the source code of the HelloWorld app from the following link.
