Augmented Reality Aided Inspection of Gears

For precisely produced polymer gears with a fast turnaround, reliable and easy inspection is crucial. The research determines standard geometrical parameters and proposes a new parameter that captures the absolute deviation of the gear from the designed model. The new parameter can be used to determine the shrinkage of polymer gears and serves as feedback to the design process through augmented reality aided inspection. The inspection begins with an optical scan, which captures the geometry of the whole gear and enables a comparison to the CAD model. Using model-based definition, the measures and tolerances from the design phase are transferred to the inspection through STEP AP 242. During the inspection, the gear is evaluated against the prescribed tolerances. The results are fed back to the design process through augmented reality, which enables a clear presentation of the results on the actual object. The presentation indicates where large shrinkage occurs and shows how much the mould design needs to be changed. The results include a colour map of the deviation, standard geometrical parameters, the shrinkage parameter, and the measure/tolerance check.


INTRODUCTION
Gears are essential elements in engineering. Ensuring their quality and optimal manufacturing parameters is, therefore, crucial [1,2]. An accurate, holistic, and simple inspection method is needed for precisely produced gears. To ensure comparable inspection of gears, certain parameters must be determined, calculated, and evaluated per the available standards [3,4] after manufacturing and before final assembly. That involves a method for attaining the parameter values and defining their quality grades.
Standardized geometrical parameters allow for accurate and regulated gear inspection, but existing inspection methods only provide a limited range of gear measurements at particular locations. This limited evaluation is prevalent in current evaluation procedures [5]. Therefore, a method to obtain holistic three-dimensional (3D) measurements with optical inspection was thoroughly investigated in a previous project [6,7]. 3D optical scanning was utilised to acquire the measurement data, and the developed software was then used to process and evaluate the data.
A previously developed method [6] and its software were upgraded to include an automated transfer of information through product manufacturing information with model-based definition, and to support inspection with augmented reality. The method was also extended with additional parameters that describe the total deviation of the geometry, so that it no longer evaluates only comparative differences.

Model-based Definition
The automated transfer of information from the model to the inspection procedure was done with Model-Based Definition (MBD), which is a modern method for handling engineering processes utilising 3D models as comprehensive foundations of information [8].
It has become the fundamental information carrier [9] between computer-aided design, computer-aided production, and computer-aided quality management due to its strong characteristics of representation and digitization.
With the developments in manufacturing and information technology, the time available for manufacturing a new product and for its qualitative monitoring is getting shorter [10]. Simultaneously, the quality requirements, the intricacy of products, and the necessary geometric tolerances are increasing [11]. New strategies are needed to shorten the time to market and to increase savings. Savings can be expressed as a combination of direct savings, such as reduced development time, rework, and man-hours, and indirect savings, such as improved knowledge management [12]. The additional information is stored as Product Manufacturing Information (PMI), which is relayed by MBD alongside the solid models and associated metadata. Annotations and characteristics that describe the product geometry and requirements are included in the PMI [13]. ASME Y14.41-2012 [14], which serves as the basis for the international standard ISO 16792, is the industry norm for displaying GD&T in an axonometric view in three-dimensional space. ISO STEP AP 242 [15] standardizes Managed Model-based 3D Engineering. Other neutral formats are available; however, the STEP format is the optimal choice for data transfer [16]. There are various approaches to how a STEP file can express PMI data. In the semantic representation, the PMI data is stored in a machine-readable manner. This allows for automatic data consumption, updating, manufacturing, calculation, inspection, information re-use, design updates, and other downstream applications. Only Application Protocol 242 enables the semantic representation of PMI. The other approach is to present the PMI data graphically, which displays the information in a human-readable manner. Fig. 1 displays a model of a gear with semantically written PMI.
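As an illustration of the machine-readable nature of semantic PMI, the sketch below scans the DATA section of a STEP file for entity types typical of the semantic representation. The entity-name subset and the sample fragment are illustrative assumptions, not taken from the project's actual models:

```python
import re

# Illustrative subset of entity names used for semantic PMI in STEP AP 242.
SEMANTIC_PMI_ENTITIES = {
    "DIMENSIONAL_CHARACTERISTIC_REPRESENTATION",
    "PLUS_MINUS_TOLERANCE",
    "GEOMETRIC_TOLERANCE",
}

# Matches STEP instance lines of the form "#101 = ENTITY_NAME(...)".
ENTITY_RE = re.compile(r"#\d+\s*=\s*([A-Z0-9_]+)\s*\(")

def count_semantic_pmi(step_text: str) -> dict:
    """Count occurrences of semantic PMI entities in a STEP file body."""
    counts = {}
    for name in ENTITY_RE.findall(step_text):
        if name in SEMANTIC_PMI_ENTITIES:
            counts[name] = counts.get(name, 0) + 1
    return counts

# Minimal synthetic DATA-section fragment for demonstration.
sample = """
#101 = PLUS_MINUS_TOLERANCE(#102, #103);
#104 = GEOMETRIC_TOLERANCE('flatness', '', #105, #106);
#107 = CARTESIAN_POINT('', (0., 0., 0.));
"""
print(count_semantic_pmi(sample))
# {'PLUS_MINUS_TOLERANCE': 1, 'GEOMETRIC_TOLERANCE': 1}
```

A graphical-only PMI export would contain none of these semantic entities, which is one quick way to check which representation a given STEP file carries.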
To acquire a holistic geometry of the measured gear, the measurement was done with an optical inspection method.

Optical Inspection Methods
Gear measurements can be completed using contact [17] and non-contact methods. Tactile methods are accurate; however, they are relatively slow. Non-contact methods enable quick point acquisition and are being used increasingly. A common optical method for obtaining a point cloud uses interferometric sensors, which are precise but slow [18]. Laser line triangulation is faster and frequently used [19]; however, its accuracy is limited [20], so it is most often applied to larger objects. It is also the most frequently used method for on-line measurement. In the presented research, a structured light method was employed for the optical measurements of the manufactured gears. Structured light scanning projects a known pattern onto the part; the way the pattern deforms when striking the surface informs the system of the depth and surface information. This offers fast measurements; however, it is more susceptible to lighting conditions. The study used an ATOS Compact Scan 5M by GOM. The accuracy of the system is roughly 2 μm. Fig. 2 demonstrates how the gear measurement was taken. The gear was coated with a scanning powder, and reference points were placed on it. It was then rotated on a turntable, with multiple scans being taken during the rotation. The scans were then stitched together and exported as an STL file.
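Since the scan is exported as an STL file, a quick programmatic sanity check of the export is straightforward. A minimal sketch, assuming a binary STL (80-byte header followed by a little-endian uint32 triangle count); the demo file and its single degenerate triangle are synthetic:

```python
import struct

def stl_triangle_count(path: str) -> int:
    """Read the triangle count from a binary STL file
    (80-byte header followed by a uint32 triangle count)."""
    with open(path, "rb") as f:
        f.seek(80)                       # skip the 80-byte header
        return struct.unpack("<I", f.read(4))[0]

# Write a minimal binary STL (one triangle: normal + 3 vertices + attribute).
tri = struct.pack("<12fH", *([0.0] * 12), 0)
with open("demo.stl", "wb") as f:
    f.write(b"\0" * 80 + struct.pack("<I", 1) + tri)

print(stl_triangle_count("demo.stl"))  # 1
```

Comparing the declared triangle count with the actual file size (84 + 50 bytes per triangle) catches truncated exports before they enter the evaluation pipeline.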
The evaluation workflow is illustrated in Fig. 3. The gear is first scanned to acquire the STL of the geometry. The STL is then imported into GOM Inspect [21] along with the theoretical model. There, the models are aligned, and the measured data is cut into sections. The sections are then imported into the developed Python software, where the deviations are assessed, including the PMI from the theoretical model. The quality parameters and the tolerance evaluation can then be exported as an Excel report.
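The deviation assessment in the developed software is not published in detail; the following is a minimal sketch of the core idea, assuming the deviation of each measured point is taken as its distance to the nearest point of the nominal geometry (a brute-force stand-in for the real surface comparison):

```python
import math

def point_deviations(measured, nominal):
    """For each measured point, the distance to the closest nominal point.
    Brute force for clarity; a k-d tree would be used at real scan densities."""
    devs = []
    for m in measured:
        devs.append(min(math.dist(m, n) for n in nominal))
    return devs

# Hypothetical point sets (mm) standing in for scan sections and CAD geometry.
nominal = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
measured = [(0.0, 0.1, 0.0), (1.05, 0.0, 0.0)]

devs = point_deviations(measured, nominal)
print([round(d, 3) for d in devs])  # [0.1, 0.05]
```

From such per-point deviations, summary statistics and tolerance checks can be aggregated per section and written out to the report.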
Besides evaluating physical parts, this method is also useful for evaluating the results of a plastic injection moulding simulation, where measurement with coordinate measuring machines (CMMs) is not possible. The simulation results include an STL file of the finished part, which can be evaluated with the developed method. However, the method needs to include new parameters that capture the absolute differences.

Expanding the Evaluation with More Parameters
The technique is useful for assessing polymer gears produced by injection moulding because of the significant variations that can occur there. It was proven that the quality grades can be accurately determined [7]; however, standard quality parameters do not take large-scale deviations into account. They currently only evaluate relative differences of the geometry. New parameters need to be developed to also include absolute differences.

Augmented Reality Aided Inspection
The method of quality inspection of gears can be greatly upgraded and extended with the addition of augmented reality (AR). The determined parameters, the PMI evaluation, the shrinkage parameters, and a visual representation of the deviation overlaid on the actual gears can greatly improve the designer's understanding of the deviations that occur on the manufactured gears. The advantage of reviewing the deviations overlaid on the gear, compared with inspecting them on a computer screen, is that the designer or quality inspection engineer does not need to switch between viewing the actual gear and the deviations from the theoretical gear shown elsewhere. Now that many previous AR limitations are being resolved, the technology is increasingly viable for industrial use. The adoption of AR is also accelerated by the experience gained with VR technologies [22]. Some of the use cases are knowledge transfer [22], ensuring heritage safety [23], quality control [24], remote maintenance [25], maintenance [26], training simulators [27], and visualization of computer-aided manufacturing commands [28]. There have also been advances in optimising text usage and translating text commands into 2D symbols for AR-based manuals; experiments validated that such manuals are clearer than other manuals [29]. There have also been efforts to create a vocabulary of graphical symbols for AR maintenance instructions [30].
In some use cases, VR technology is more appropriate than AR. This is the case for simulations, but in fields such as maintenance and healthcare, AR, in which virtual data can be superimposed over a real-world context, is better because it allows contact with the real environment. AR aims to enhance reality, whereas VR aims to substitute it [31]. The simulation of CNC machining, which requires users to devote a lot of time to modelling the machining environment, including all of the various machining instructions, is an example where AR is better suited than VR [32]. By conducting the simulation in the true machining environment, AR circumvents this problem. Therefore, AR is helpful when the user, the physical object, and the virtual model interact extensively.
Currently, not much research has been done on using AR for inspection. A model for augmented reality assisted inspection was presented in [33], where the transfer of PMI to AR was studied and a method for AR assisted quality inspection was developed.

Research Goals
The relevant research mainly focuses on maintenance and assembly-related applications. When considering the integration of MBD and AR into inspection, there is still a considerable research gap, and methodologies and guidelines need to be developed in line with Industry 4.0. It is also difficult to introduce AR aided inspection into an enterprise because the necessary prerequisites and criteria are unavailable.
The goal is to enhance the previously developed method with AR and added parameters, to empower the designer with more information. This will make it easier for the designer to make informed decisions. Additionally, the goal is to integrate the use of PMI from CAD models in augmented reality. The transfer of data is preferable with native formats or with ISO STEP AP242.

IMPLEMENTATION OF THE NEW QUALITY PARAMETER
The new parameter and its integration into the application consider distinctive dimensions of the gear, such as the root circle diameter, the gear width, and the addendum circle diameter, to determine the difference from the theoretical geometry. In that way, a percentage change to the model can be suggested to the designer. In the case of plastic injection, the designer can scale the mould design up or down by a certain percentage.
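A possible formulation of such a shrinkage parameter can be sketched as follows; the formula, the dimension values, and the mould correction factor are illustrative assumptions, not the exact definition used in the project:

```python
def shrinkage_percent(nominal: float, measured: float) -> float:
    """Relative deviation of a measured dimension from the nominal one, in %.
    Positive values mean the part is smaller than designed (shrinkage)."""
    return (nominal - measured) / nominal * 100.0

# Hypothetical dimensions of an injection-moulded gear (mm).
radial = shrinkage_percent(nominal=80.000, measured=79.832)  # addendum diameter
width = shrinkage_percent(nominal=20.000, measured=19.815)   # gear width

# A mould scale factor that would compensate the radial shrinkage.
scale = 1.0 / (1.0 - radial / 100.0)
print(f"radial shrinkage: {radial:.2f} %, mould scale: {scale:.4f}")
```

Averaging such percentages over several distinctive dimensions yields a single suggested correction the designer can apply to the mould geometry.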
Augmented reality can show where the changes are necessary, displaying the differences to the designed geometry more visibly, and can enable further developments of the method to include suggestions to the user.

DEVELOPING THE AUGMENTED REALITY ENHANCED INSPECTION
AR devices can be handheld or head-mounted (HMDs). The advantage of HMDs is that their use is hands-free, so the user can perform the required task easily. A significant limitation of most modern VR headsets is their need to be tethered to computers. There have been some advancements in tetherless headsets; however, they normally cannot support engineering applications. AR headsets, in contrast, are all developed to be mobile and tetherless. One of the more capable headsets is the Microsoft HoloLens, which is already seeing many uses in enterprises [34]. Different devices also offer different input options: some can use external devices such as hand controllers and other wearables, while others use gaze controllers. The chosen augmented reality HMD was the Microsoft HoloLens 2, as it is currently the most advanced. It offers real-world scanning, voice input, a wide field of view, and gesture recognition.
An important part of the AR application is the positioning of the augmented content over the real environment. Accurate positioning is achieved through tracking [22]. Most tracking methods are vision-based. Normally, the application has prior knowledge of what to track, and the tracking can be based on models, features, or markers. The most robust and reliable tracking is usually model-based; however, it requires a model. Another robust tracking approach is marker-based, but it is cumbersome and frequently avoided. The developed method uses model-based tracking, which was achieved with Vuforia Engine 9.2. The model is already required for processing the deviations, so it is readily available for tracking. Vuforia is integrated into the Unity 2018.3 game engine, which was used to develop the application and to add additional information.
The process of creating augmented content and applications is called authoring [22]. There are a few distinct types of authoring solutions. The manual method is the most common one: the application, which requires the development of models and their integration into AR, is generated manually. The automatic approach is more pragmatic and convenient; such techniques are designed to automatically produce enriched content from available data without guidance from professionals or creators. The AR aided inspection method was developed with the manual authoring process. The workflow required for transferring the inspection results to the AR application is displayed in Fig. 4. The CAD model and the inspection result of the evaluated deviations were aligned to achieve model-based tracking and to overlay the deviations. The alignment accuracy is important, as the model is superimposed over the actual gear. It is possible to import the graphical representation of the PMI into the Unity game engine with the PiXYZ plugin; however, the semantic import of PMI is still not available [33]. The Unity software enables the creation of additional elements. One approach would be to recognise the graphical presentation of the PMI and recreate it in a semantic representation. The application is created manually; however, there are many opportunities for automating the authoring process. Information such as text and 3D models could be automatically extracted from an authoring database.
Context-awareness [22] is an additional content-related method. It attempts to use contextual knowledge to change the augmented output; in other words, it adjusts already generated content according to data gathered from the context. One or more context variables can be provided in the application. An example of a context-awareness technique is shown in [35], where the application adapts the displayed portion of the 3D model during the 3D printing process.
The process of implementing the augmented reality aided inspection is shown in Fig. 4. The STL file, which results from the simulation or the measurement, contains the geometry to be evaluated. It is imported into GOM Inspect and aligned to the nominal geometry, where a comparison of the geometries is made as a deviation surface plot. The result is exported in the PLY format. However, the PLY format stores the deviation values and colours in the vertices, and these need to be transferred to the faces. This is done in MeshLab 2016. MeshLab cannot export the results in the FBX format, which is the native format of the Unity software used to build the application, so the mesh is exported as an OBJ file and converted to FBX in Blender. The FBX file, the deviation scale, and the PMI and geometric elements evaluated in the developed software are then imported into Unity, where the application is created. Vuforia is used to create the model-based tracking for the AR application. The conversion process between the individual formats needs to be optimised, as it includes many unnecessary steps.
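The vertex-to-face transfer performed in MeshLab can be sketched as a simple averaging of the values at each face's vertices; this is an illustrative stand-in for the actual MeshLab filter, with hypothetical data:

```python
def vertex_to_face_values(faces, vertex_values):
    """Transfer per-vertex scalar values (e.g. deviations stored in a PLY
    file) to per-face values by averaging over each face's vertices."""
    return [sum(vertex_values[i] for i in face) / len(face) for face in faces]

# Two triangles sharing an edge; per-vertex deviation values in mm.
faces = [(0, 1, 2), (1, 2, 3)]
vertex_dev = [0.00, 0.03, 0.06, 0.09]

face_dev = vertex_to_face_values(faces, vertex_dev)
print([round(v, 3) for v in face_dev])  # [0.03, 0.06]
```

The same averaging applies to the RGB colour channels of the deviation plot, so that each face carries a single colour when the mesh is rendered in Unity.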

RESULTS AND DISCUSSION
The results and the application are shown in Fig. 5 and Fig. 6. The images illustrate the view of the designer while evaluating the gear. The AR application shows the deviation plot, the deviation scale, the evaluated PMI, the common geometrical quality parameters, and the implemented shrinkage parameter, which suggests to the designer how much the mould design needs to be changed.
The information is presented independently of the analysed component using static 2D/3D elements and icons displayed on the screen. Fig. 5 and Tab. 1 present the determined shrinkage parameter and the PMI evaluation. It is evident from the results that the gear has shrunk 0,21 % in the radial direction. Armed with this knowledge, the designer can easily correct the design of the mould and determine where the biggest deviations occurred. The PMI evaluation shows whether the gear width and the gear hole diameter are within the prescribed tolerance values. The information taken from the MBD model offers an automated tolerance check and enriches the inspection. The gear width, with the value 19,815 mm, is within tolerance, whereas the gear hole diameter, with the value 15,773 mm, is not. Fig. 6 presents the deviations and the scale more clearly. Two big deviations, which need to be addressed, can be seen on the teeth: a lead profile deviation and a pitch deviation. Other large deviations can also be seen in the roots of the teeth, mainly because the scanning conditions there are poorer. Tab. 2 presents the quality grades of the gear geometrical parameters. These are calculated from the deviations on the teeth and range from 0 to 12, where 12 is the worst quality grade. Fig. 6 shows a large lead profile deviation, which is reflected in a worse quality grade in Tab. 2 for the parameters Fβ, ff,β, and fH,β. A large pitch deviation is also present, which results in a quality grade of 12 for the single pitch deviation parameter.
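The mapping from a measured deviation to a quality grade can be sketched as follows, assuming ISO 1328-style grading in which the tolerance grows by a factor of √2 per grade step from a known reference grade; the reference tolerance and the deviation values are hypothetical:

```python
import math

def quality_grade(deviation_um, tol_grade5_um, max_grade=12):
    """Smallest quality grade whose tolerance covers the measured deviation.
    Assumes tolerances grow by a factor of sqrt(2) per grade step, anchored
    at a known reference tolerance for grade 5 (ISO 1328-style grading)."""
    for q in range(0, max_grade + 1):
        if deviation_um <= tol_grade5_um * math.sqrt(2) ** (q - 5):
            return q
    # Deviations beyond the grade-12 tolerance are clipped to the worst grade.
    return max_grade

# Hypothetical: reference tolerance 10 µm at grade 5, measured deviation 25 µm.
print(quality_grade(25.0, 10.0))  # 8
```

A deviation far beyond the grade-12 tolerance still reports grade 12, matching the clipping behaviour seen for the single pitch deviation parameter in Tab. 2.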

CONCLUSION AND OUTLOOK
In the paper, a previously developed inspection method was upgraded with the principles of model-based definition and augmented reality. New parameters were also included to aid the designer.
The benefit to the designer is a clearer presentation of deviations with augmented reality, where the information can be overlaid on the actual gear being inspected. The inspection process is enriched with the automatic evaluation of the PMI transferred from the designed model, and with a new parameter that evaluates the large-scale deviations, i.e., the shrinkage of the injection-moulded gears. The mould design can then be adjusted according to the results of the evaluation. In some applications, model-based tracking is not possible because of a lack of distinct features; there, the results can be displayed freely on the digital model of the gear.
In further work, an archive and workflow should be built to enable the automated creation of augmented reality applications. The pipeline for getting the inspection result into AR is quite extensive, and automating it would be a significant contribution. More parameters that evaluate the nominal deviation from the required geometry should be introduced to evaluate the gear with more confidence in the case of injection moulding. The transfer of semantic PMI to augmented reality is still an open question and needs to be studied further; however, the evaluation of the PMI was included in the developed software. To validate and analyse the proposed approach and to determine its impact on user results, more work should be undertaken.