hig.se Publications
Robustness of a neural network used for image classification: The effect of applying distortions on adversarial examples
University of Gävle, Faculty of Engineering and Sustainable Development, Department of Industrial Development, IT and Land Management, Computer science.
2018 (English). Independent thesis, Basic level (professional degree), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

Powerful classifiers such as neural networks have long been used to recognise images; these images might depict objects such as animals, people, or plain text. Distortions affect a neural network's ability to recognise images, for instance when images are changed by distortions related to the camera. Camera-related distortions, and how they affect accuracy, have previously been explored. Recently, it has been shown that images can be intentionally made harder to recognise, an effect that lasts even after they have been photographed. Such images are known as adversarial examples. The purpose of this thesis is to evaluate how well a neural network can recognise adversarial examples that are also distorted. To evaluate the network, the adversarial examples are distorted in different ways and then fed to the neural network. Four kinds of distortion (rotation, blur, contrast, and skew) were used to distort the examples, and for each type and strength of distortion the network's classification accuracy was measured. The results show that all distortions influenced the neural network's ability to recognise images. It is concluded that the type and strength of a distortion are important factors when classifying distorted adversarial examples, but also that some distortions, rotation and skew, retain their characteristic influence on the accuracy even when combined with other distortions.
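The evaluation procedure described in the abstract can be sketched as a loop over distortion strengths: apply a distortion to an input, feed the result to a classifier, and record whether it is still classified correctly. The sketch below is not the thesis code; it uses a contrast distortion on a toy grayscale image and a hypothetical stand-in classifier (the thesis used LeNet on MNIST adversarial examples).

```python
def adjust_contrast(image, factor):
    """Scale pixel values toward or away from mid-grey (128) by `factor`,
    clamping to the valid 0-255 range. factor < 1 reduces contrast."""
    return [[max(0, min(255, int(128 + (p - 128) * factor))) for p in row]
            for row in image]

def mean_pixel(image):
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def classify(image):
    """Hypothetical stand-in for the neural network: a trivial
    threshold rule, used only to make the loop runnable."""
    return 1 if mean_pixel(image) > 100 else 0

# Mirror the "for each type and strength of distortion" procedure:
# a toy 2x2 dark image whose true label (under the rule above) is 0.
example = [[60, 60], [60, 60]]
for factor in (1.0, 0.5, 0.1):
    distorted = adjust_contrast(example, factor)
    correct = classify(distorted) == 0
    print(f"contrast factor {factor}: correct = {correct}")
```

At factor 0.1 the pixels are pulled close enough to mid-grey that the stand-in classifier flips its answer, illustrating how increasing distortion strength can degrade accuracy.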

Place, publisher, year, edition, pages
2018, p. 25
Keyword [en]
LeNet, Distorted Images, MNIST, Adversarial Examples
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:hig:diva-26118
OAI: oai:DiVA.org:hig-26118
DiVA: diva2:1181511
Subject / course
Computer science
Educational program
Högskoleingenjör (Bachelor of Science in Engineering)
Available from: 2018-02-09. Created: 2018-02-08. Last updated: 2018-02-09. Bibliographically approved.

Open Access in DiVA

fulltext (585 kB), 9 downloads
File information
File name: FULLTEXT01.pdf
File size: 585 kB
Checksum (SHA-512): 775e15102dae56d41b3a321199d8403635bd19adae39bd292a4dcc01f572ceccecbbefbe412525630a14573de5241e2d158ab54e6a3385ba30ffb565f9ba5bbc
Type: fulltext
Mimetype: application/pdf

