
IED – 2053 Engineering Team Project

Project Report [Smart Parking System]

Team2

FutureWill

Abstract:


Content:

Abstract
Introduction
Background
Related Work
Experimental Section
Design
    Schematics
    Flowchart
    Targeted performance
    Challenges
Implementation
    Website
    Python code
    Bash Scripts
    OS configuration
Test Results
    Performance
    Product datasheet
    TRL Level
    Improvements
Discussion
Conclusion
References


Introduction:


Background:

Related Work:

Experimental Section:


Design:

Schematics:

Figure 1: circuit schematic

Figure 2: GPIO map

The hardware components of the project consist of:

#   Part name                            Description
1   Raspberry Pi 3 Model B+              Computing and networking unit
2   Resistors                            4x 330 Ω, 2x 2 kΩ, 2x 1 kΩ
3   Ultrasonic Ranging Module HC-SR04    Sensor to find approaching distance
4   LEDs                                 2x red, 2x green
5   Cameras                              Capture vehicle plate number

Table 1: Hardware Components

The Raspberry Pi is a single-board computer that provides multiple USB connections and network connectivity. The Pi comes with a quad-core 1.2 GHz processor, 1 GB of RAM and 40 pins for general-purpose input/output control (figure 1). Each LED requires a GPIO port for power. Each ultrasonic sensor requires 4 connections: Vcc, trigger, echo and ground. The cameras use 2 of the Pi's 4 USB ports. The resistor values for the ultrasonic sensor were chosen to form a voltage divider circuit that drops the voltage going to the GPIO pin from 5 V down to 3.3 V. The ultrasonic sensor has a range of 2 cm up to a maximum of 4 m, which covers the appropriate distance for the camera to take a clean picture, and its measuring angle is 15 degrees.
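The echo output of the HC-SR04 is a 5 V signal, so the divider sits between the echo pin and the GPIO input. Assuming the 1 kΩ and 2 kΩ resistors from the parts list form each divider (a common arrangement for this sensor, not stated explicitly in the report), the voltage reaching the GPIO pin is:

Vout = Vin x R2 / (R1 + R2) = 5 V x 2 kΩ / (1 kΩ + 2 kΩ) ≈ 3.33 V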


The main consideration for the camera is sensor quality, but the choice of camera type was restricted by the budget available to the team, so basic functionality was the only option that kept the project within the price range. The cameras have a 300K CMOS sensor, 24-bit true color and 8-megapixel output.

Flowchart:

Figure 3: system workflow

The flow for the project starts with registering a plate number on the website and paying in advance for entrance. Once the registered car approaches the parking area, it is detected by the ultrasonic sensor, which signals the camera to capture an image of the plate. The central computing unit analyzes the image, extracts a string of text containing the vehicle's plate number, and compares it to the list of plate numbers already registered in the system. LEDs show whether the plate number is recognized, and a signal can be sent to the barrier (depending on how the parking lot controls traffic). The exit node runs the same flow once the vehicle approaches the exit gate: it analyzes the plate and calculates the time used, and this output is sent back to the website to apply charges per the parking policy and log the activity in the user's profile.

Targeted performance:
The main idea of this product is to provide an easy and quick way to access a parking lot. The measured time for a user to access a parking area with an ID card was 9.66 seconds on average (tests were carried out at the Bangor School of Computer Science and Electronic Engineering, with staff members accessing the parking area via a card reader). The experiment showed several users taking a long time to locate their access card, while others had multiple false readings from the NFC reader. The product is expected to be faster than typical NFC card readers, eliminate human interaction with the hardware, and avoid exposing the driver to the elements. Optical Character Recognition (OCR) is expected to be quick, averaging 146.8 milliseconds. The overall system response time is targeted at less than 7 seconds, including network latency.

Challenges:
The design challenges relate to image quality, network latency and successful plate recognition. For image quality, the camera needs adequate lighting to capture a clear image, and it should have auto-exposure along with automatic shutter speed to adjust to different times of day. The capture angle is important to avoid image distortion and inaccurate plate recognition. A higher pixel density helps recognize the plate faster and more accurately, and image settings such as resolution and white balance are also factors in successful recognition. For an OCR system, having enough CPU resources reduces the total processing time per image; hosting the OCR system locally on a Raspberry Pi with an ARM processor could lead to unwanted delays. When using a cloud API instead, a fast and reliable network connection is essential to meet the response-time target.


Implementation:

Website:
A website is an essential way to reach customers easily. In order to design, configure and implement the website, a platform had to be selected for hosting it and connecting user data to the system. For the purpose of this project the website is hosted locally on the Raspberry Pi using various packages. XAMPP is an Apache distribution that contains all the packages required to host a website locally; MariaDB, Perl and PHP were installed and configured as core packages. Initially, the Bitnami package containing the WordPress binaries was installed on top of XAMPP, but it was removed due to compatibility issues with the kernel version of the OS. WordPress was then installed manually from its binaries, the required plugins were added, and the code was modified to provide the required functionality.

Figure 4: new user registration on the site

Once a user is registered with the mandatory fields, a profile view with multiple options is provided to the user (figure 5). Administrators of the website have a profile viewer with more control over user profiles and all the data regarding a user (figure 6).


Figure 6: administration view of user profile

Figure 5: user profile view

Python code:
The main functionality of the project resides in the Python code. For the two parts of the system (entrance and exit gates), separate scripts were created to simplify the process and allow in-depth debugging. The code was written with Python 3.4, which comes preinstalled in the Raspberry Pi OS. Each script starts by importing the libraries needed to control the input/output pins, run subprocesses and invoke other scripts, call the database, and parse JSON. Different variables were defined to control the hardware (Table 2).

PIN        FUNCTION
POWER      5 V power
GND        Multiple ground terminals
GPIO 4     Trigger port for ultrasonic 1
GPIO 10    Echo port for ultrasonic 1
GPIO 18    Echo port for ultrasonic 2
GPIO 22    Red LED
GPIO 23    Green LED
GPIO 24    Trigger port for ultrasonic 1
GPIO 25    Red LED
GPIO 27    Green LED

Table 2: pins and functions of GPIO on the Pi

#!/usr/bin/python3.4 (entry_node.py)

#Libraries
import RPi.GPIO as GPIO
import time
import subprocess
import requests
import base64
import json
import datetime

#time control
x = datetime.datetime.now()

#led control
GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(22, GPIO.OUT)
GPIO.setup(23, GPIO.OUT)

#GPIO Mode (BOARD / BCM)
GPIO.setmode(GPIO.BCM)

#set GPIO Pins
GPIO_TRIGGER = 18
GPIO_ECHO = 24

#set GPIO direction (IN / OUT)
GPIO.setup(GPIO_TRIGGER, GPIO.OUT)
GPIO.setup(GPIO_ECHO, GPIO.IN)

def distance():
    # set Trigger to HIGH
    GPIO.output(GPIO_TRIGGER, True)

    # set Trigger after 0.01ms to LOW
    time.sleep(0.00001)
    GPIO.output(GPIO_TRIGGER, False)

    StartTime = time.time()
    StopTime = time.time()

    # save StartTime
    while GPIO.input(GPIO_ECHO) == 0:
        StartTime = time.time()

    # save time of arrival
    while GPIO.input(GPIO_ECHO) == 1:
        StopTime = time.time()

    # time difference between start and arrival, multiplied by the
    # sonic speed (34300 cm/s) and divided by 2 (there and back)
    TimeElapsed = StopTime - StartTime
    distance = (TimeElapsed * 34300) / 2

    return distance

def check(plate_number):
    #checking number plate in each line of file
    datafile = open(filename, 'r')
    for line in datafile:
        if plate_number in line:
            datafile.close()
            return True
    datafile.close()
    return False

if __name__ == '__main__':
    try:
        while True:
            dist = distance()
            print("Measured Distance = %.1f cm" % dist)
            time.sleep(1)
            converted = int(dist)
            measured = 10
            if converted < measured:
                # vehicle in range: capture an image and log the entry time
                subprocess.call(['/home/pi/webcam.sh'])
                print("LED on .. captured image")
                enter_time = open("time.txt", "w")
                # hour and minute on separate lines, the format time.py reads
                enter_time.write(x.strftime("%H\n%M"))
                enter_time.close()
                time.sleep(2)
                filename = '/home/pi/plates.txt'

                IMAGE_PATH = '/home/pi/image2.jpg'
                SECRET_KEY = 'sk_6da8e710ca13a1ebba9b7d38'

                with open(IMAGE_PATH, 'rb') as image_file:
                    img_base64 = base64.b64encode(image_file.read())

                url = 'https://api.openalpr.com/v2/recognize_bytes?recognize_vehicle=1&country=eu&secret_key=%s' % (SECRET_KEY)
                r = requests.post(url, data=img_base64)

                #print(json.dumps(r.json(), indent=2))
                answer = r.json()
                plate_number = answer['results'][0]['plate']
                print(plate_number)

                result = check(plate_number)

                if result == True:
                    print('found')
                    GPIO.output(23, GPIO.HIGH)
                    time.sleep(0.5)
                    GPIO.output(23, GPIO.LOW)
                else:
                    print('Not found')
                    GPIO.output(22, GPIO.HIGH)
                    time.sleep(0.5)
                    GPIO.output(22, GPIO.LOW)

    # Reset by pressing CTRL + C
    except KeyboardInterrupt:
        print("Measurement stopped by User")
        GPIO.cleanup()

#!/usr/bin/python3.4 (exit_node.py)

#Libraries
import RPi.GPIO as GPIO
import time
import subprocess
import requests
import base64
import json
import datetime

#time control
x = datetime.datetime.now()

#led control
GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(25, GPIO.OUT)
GPIO.setup(27, GPIO.OUT)

#GPIO Mode (BOARD / BCM)
GPIO.setmode(GPIO.BCM)

#set GPIO Pins
GPIO_TRIGGER = 4
GPIO_ECHO = 22

#set GPIO direction (IN / OUT)
GPIO.setup(GPIO_TRIGGER, GPIO.OUT)
GPIO.setup(GPIO_ECHO, GPIO.IN)

def distance():
    # set Trigger to HIGH
    GPIO.output(GPIO_TRIGGER, True)

    # set Trigger after 0.01ms to LOW
    time.sleep(0.00001)
    GPIO.output(GPIO_TRIGGER, False)

    StartTime = time.time()
    StopTime = time.time()

    # save StartTime
    while GPIO.input(GPIO_ECHO) == 0:
        StartTime = time.time()

    # save time of arrival
    while GPIO.input(GPIO_ECHO) == 1:
        StopTime = time.time()

    # time difference between start and arrival, multiplied by the
    # sonic speed (34300 cm/s) and divided by 2 (there and back)
    TimeElapsed = StopTime - StartTime
    distance = (TimeElapsed * 34300) / 2

    return distance

def check(plate_number):
    #checking number plate in each line of file
    datafile = open(filename, 'r')
    for line in datafile:
        if plate_number in line:
            datafile.close()
            return True
    datafile.close()
    return False

if __name__ == '__main__':
    try:
        while True:
            dist = distance()
            print("Measured Distance = %.1f cm" % dist)
            time.sleep(1)
            converted = int(dist)
            measured = 10
            if converted < measured:
                # vehicle in range: capture an image, log the exit time
                # and compute the time spent (time.py)
                subprocess.call(['/home/pi/webcam2.sh'])
                print("LED on .. captured image")
                time.sleep(2)
                exit_time = open("time2.txt", "w")
                exit_time.write(x.strftime("%H\n%M"))
                exit_time.close()
                filename = '/home/pi/plates.txt'
                subprocess.check_call(["python3.7", "/home/pi/time.py"])

                # OPENALPR
                IMAGE_PATH = '/home/pi/image222.jpg'
                SECRET_KEY = 'sk_6da8e710ca13a1ebba9b7d38'

                with open(IMAGE_PATH, 'rb') as image_file:
                    img_base64 = base64.b64encode(image_file.read())

                url = 'https://api.openalpr.com/v2/recognize_bytes?recognize_vehicle=1&country=eu&secret_key=%s' % (SECRET_KEY)
                r = requests.post(url, data=img_base64)

                #print(json.dumps(r.json(), indent=2))
                answer = r.json()
                plate_number = answer['results'][0]['plate']
                print(plate_number)

                result = check(plate_number)

                if result == True:
                    print('found')
                    GPIO.output(25, GPIO.HIGH)
                    time.sleep(0.5)
                    GPIO.output(25, GPIO.LOW)
                else:
                    print('Not found')
                    GPIO.output(27, GPIO.HIGH)
                    time.sleep(0.5)
                    GPIO.output(27, GPIO.LOW)

    # Reset by pressing CTRL + C
    except KeyboardInterrupt:
        print("Measurement stopped by User")
        GPIO.cleanup()

The code below is used to calculate how long the customer stayed in the parking lot, so they can be charged based on the hourly rate of the lot. A call to this code (time.py) is embedded in the exit-gate code (exit_node.py) to calculate the duration once the customer is registered and the car is identified correctly at the gate. It imports the two timestamp logs written by the entry and exit scripts and parses the strings into integers. The resulting variable (difference) could be sent to the website or payment gateway to charge the customer accordingly.

#!/usr/bin/python3.4 (time.py)

with open("time.txt", "r") as myfile:
    hrs = int(myfile.readline().rstrip('\n'))
    mns = int(myfile.readline(2))

with open("time2.txt", "r") as myfile2:
    hrs2 = int(myfile2.readline().rstrip('\n'))
    mns2 = int(myfile2.readline(2))

t1 = hrs * 60 + mns
t2 = hrs2 * 60 + mns2

difference = t2 - t1

print("customer spent " + str(difference) + " minutes.")
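The charging step itself is not part of the project code. A minimal sketch of how the difference could be turned into a fee is shown below; the hourly rate and the payment endpoint are assumptions for illustration only and would be appended after time.py computes difference.

import math

HOURLY_RATE = 1.50                          # assumed rate per hour, for illustration
hours_billed = math.ceil(difference / 60)   # round up to the next full hour
charge = hours_billed * HOURLY_RATE
print("charge: %.2f" % charge)
# this value could then be posted to the website or payment gateway, e.g.
# requests.post("https://example.org/parking/charge", data={"amount": charge})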

Bash Scripts:
This script fetches users' plate numbers from the MySQL database and writes them to a file. Filters are applied with awk to select only the required field of the output, and the result is stored in a new file named clean.txt. For the final version of the plate-numbers file, only newly added clean entries are appended to the final record, plates.txt.

#!/bin/bash (mysql query to export plate numbers)
mysql -u root -e "SELECT * FROM wp_usermeta WHERE meta_key = 'car_registration_number' " wordpress > ~/test.txt
cat ~/test.txt | awk 'FNR >1 {print $4} ' > ~/clean.txt
cd ~
fgrep -vxf latest.txt clean.txt >> plates.txt
#cat latest.txt


To use the camera drivers installed in Linux from the Python code, a separate script is used for each camera and is called by the Python code once the sensor detects movement within range. The fswebcam command captures an image with the specified options for image size.

#!/bin/bash (camera command to capture image)

fswebcam -d /dev/video1 --no-banner -r 1024x768 ~/image222.jpg

OS configuration:
Having Linux as the operating system of the Raspberry Pi gives a lot of flexibility in running the system efficiently. Using run-level 3 as the default mode allows the network interface services to be configured and started automatically at boot. By disabling all unnecessary services in this run-level and setting the required services to start automatically, a more optimized OS can be achieved. The bash scripts were scheduled with crontab to run automatically every minute and to run at boot as well. The Python scripts run in the background in an infinite loop and can be stopped according to business requirements or outside working hours. Further improvements could be made by disabling GUI elements and running the system in CLI mode only; hosting the website locally prevented this optimization, as a GUI is required for theme control and faster database configuration.
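As a rough illustration of the scheduling described above, the crontab entries could look like the following (the script name export_plates.sh is assumed here, not taken from the report):

# crontab -e (hypothetical entries)
* * * * * /home/pi/export_plates.sh     # refresh plates.txt every minute
@reboot   /home/pi/export_plates.sh     # run once at boot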

Figure 7: Top command shows CPU utilization

Figure 8: all required services enabled


Test Results:

Performance:
The test methodology focused mainly on optimizing the total processing time of the system. The plan was to test different aspects of the design:

- angle of the camera
- distance for the sensor
- processing time for the cloud API vs. the local API
- image quality
- accuracy of detection

For camera angle: using the gyroscope and accelerometer in a phone, multiple images were taken and processed to find the best camera position. The results showed the best confidence (94.75%) with the camera angled between 0 and 45 degrees, while more aggressive angles produced either incorrect detections or longer total processing times, depending on the API used.

Figure 9: different angles for camera position

Testing the cloud API against the local API showed a large advantage for the cloud API, which has been trained to detect specific plate types and optimized to correct angle distortion, something shown to relate directly to the correctness of detection. The local API, in its untrained state, could not recognize one of the images and took more than 3 seconds of total processing time.


Figure 10: processing time in milliseconds

Figure 11: total processing time in milliseconds

#      SIZE       RESOLUTION   PT LOCAL     PT CLOUD     TT LOCAL     TT CLOUD
IMG1   52 kB      900x506      222.381 ms   207.500 ms   968.807 ms   678.646 ms
IMG3   64 kB      590x350      149.613 ms   114.722 ms   406.293 ms   409.302 ms
IMG4   840 kB     2560x1600    102.483 ms   72.252 ms    2940.89 ms   1280.35 ms
IMG5   436 kB     1800x1200    169.414 ms   150.971 ms   3003.09 ms   1467.99 ms
IMG6   385.6 kB   1024x768     -            254.460 ms   -            265.621 ms

Table 3: API testing with different images (PT = processing time, TT = total time)

Using the cameras purchased for this project, recognition showed a confidence of 94.73% with a total processing time of 175.53 ms. The total operation time was 5.25 seconds at the entry node, which includes uploading an image of 420.86 KB to the cloud over a 40 Mb/s internet connection. The exit node showed similar results, with a total operation time of 5.47 seconds.
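As a rough check on the network contribution (ignoring protocol overhead and assuming the full 40 Mb/s is available), the upload accounts for only a small part of that time:

420.86 KB x 8 ≈ 3.37 Mb, and 3.37 Mb / 40 Mb/s ≈ 0.084 s

so most of the 5.25 seconds is spent on capture, the fixed delays in the script, and the API round trip rather than on the upload itself.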

{
  "matches_template": 0,
  "plate": "M90DGR",
  "confidence": 94.72909545898438,
  "processing_time": {
    "plates": 123.39320373535156,
    "total": 175.53499998757616
  }
}


Product datasheet:

Figure 12: RaspberryPi power

Figure 13: Ultrasonic sensor specs

Figure 14: stress test 100% CPU utilization (71.04 C max temperature)

Figure 15: power consumption full load 100%

Figure 16: power consumption normal operation


TRL Level:
The product is currently at the entry stage of level 5. The prototype is under development; its functionality has been tested in a relevant environment, but it is not yet advanced enough to be demonstrated in the field. To move to the next level, more development and optimization are required.

Figure 17: breadboard and fitment testing

Improvements:
As a proof of concept, the product in its current state works as intended and is considered a success, but at TRL 5 further improvements are needed. The hardware is the main bottleneck of the system: adopting more capable cameras with better sensors would produce smaller images at better quality, which yields faster processing times. The code could be made more efficient in its use of CPU time and given more advanced error checking. A mobile application could also be developed to provide more value to customers and a better connection with them.

Discussion:


Conclusion:

References:
