Question by CapeGuyBen · May 20, 2017 at 12:55 PM · Tags: xml, testing, nunit

Can Unity 5.6's Test Runner optionally output NUnit 2 xml format?

As Unity 5.6 has upgraded to NUnit 3.6, the test results XML file it outputs has changed from the NUnit 2 format to the NUnit 3 format. I'm using a TeamCity continuous integration server, and its 'XML Report Processing' build feature only supports the NUnit 2 results XML format.

I've worked around this for now by writing an NUnit 3 to NUnit 2 format conversion script which I run after Unity has finished, but that doesn't seem like a very good long-term solution. I will request that NUnit 3 support be added to TeamCity's XML Report Processing feature, but I also understand there is a way to make NUnit 3 output NUnit 2 format XML files.

Is there a command line option for the Unity Test Runner to control the NUnit results xml format (between NUnit 3 and NUnit 2)?
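For reference, the CI step in question looks roughly like this. It's a minimal sketch assuming Unity 5.6's -batchmode, -runEditorTests and -editorTestsResultFile command line options (per the 5.6 manual); the Unity path and project path are placeholders, and the conversion script is the one shared in the accepted answer below.

 #!/usr/bin/env python
 # Sketch of a CI build step: run Unity's editor tests, then convert the
 # NUnit 3 results file to NUnit 2 for TeamCity. Paths are placeholders.
 import subprocess
 import sys

 UNITY = '/opt/Unity/Editor/Unity'  # placeholder editor path

 # Unity exits non-zero when tests fail, so don't abort before converting.
 unity_exit_code = subprocess.call([
     UNITY, '-batchmode', '-projectPath', '/path/to/project',
     '-runEditorTests', '-editorTestsResultFile', 'results-nunit3.xml',
 ])
 subprocess.check_call([
     'python', 'NUnit3To2FormatConverter.py',
     '-i', 'results-nunit3.xml', '-o', 'results-nunit2.xml',
 ])
 sys.exit(unity_exit_code)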

Comment by wootencl · Jun 28, 2017 at 02:12 PM

@CapeGuyBen Any chance you would be willing to share that format conversion script? Also suffering from this issue at the moment.

Reply by CapeGuyBen to wootencl · Jul 06, 2017 at 01:55 PM

Yep, see my follow-up answer below.

Reply by wootencl to CapeGuyBen · Jul 06, 2017 at 05:58 PM

@CapeGuyBen Thanks! Your script did the trick.

2 Replies

Best Answer

Answer by CapeGuyBen · Jul 06, 2017 at 01:48 PM

Update: This actually turned out to be an issue with the files Unity was outputting rather than with TeamCity's parsing. I have checked, and it is fixed in Unity 2017.2.0f3, so I am no longer using the conversion script provided below.

I'll leave my original answer here in case anyone is stuck on a version of Unity that predates the fix:

Here's the converter I wrote; I had been using it for a while on three separate projects without issue. I threw the script together as a temporary workaround and am not proud of it, but if you're unable to upgrade Unity for whatever reason you could give it a try.

NUnit3To2FormatConverter.py:

 #! /usr/bin/env python
 
 import argparse
 import os
 import sys
 import xml.dom.minidom
 
 
 def convert(input_path, output_path):
     ''' Converts the input file from the NUnit 3 format to NUnit 2 format
         and writes the result into output_path.
     '''
     
     def read_input(input_path):
         dom_tree = xml.dom.minidom.parse(input_path)
         collection = dom_tree.documentElement
 
         tests = collection.getElementsByTagName('test-case')
 
         results = {}
         results['name'] = collection.getAttribute('name')
         # Keep the suite duration separate from the wall-clock start time,
         # which is read from the 'start-time' attribute below.
         results['duration'] = collection.getAttribute('duration')
 
         test_results = []
         results['test_results'] = test_results
 
         num_tests = 0
         num_passed_tests = 0
         num_failed_tests = 0
         num_ignored_tests = 0
         num_skipped_tests = 0
 
         for test in tests:
             num_tests = num_tests + 1
             curr_test_results = {}
             curr_test_results['name'] = test.getAttribute('fullname')
             test_new_result = test.getAttribute('result')
             test_new_runstate = test.getAttribute('runstate')
 
             if (test_new_result == 'Passed'):
                 # Test Passed...
                 num_passed_tests = num_passed_tests + 1
                 curr_test_results['state'] = 'Success'
                 curr_test_results['result'] = 'Success'
                 curr_test_results['executed'] = 'True'
                 curr_test_results['success'] = 'True'
                 curr_test_results['time'] = test.getAttribute('duration')
             elif (test_new_result == 'Failed'):
                 if (test_new_runstate == 'Runnable'):
                     # Test Failed...
                     num_failed_tests = num_failed_tests + 1
                     curr_test_results['state'] = 'Failed'
                     curr_test_results['result'] = 'Failure'
                     curr_test_results['executed'] = 'True'
                     curr_test_results['success'] = 'False'
                     curr_test_results['time'] = test.getAttribute('duration')
 
                     failure_elem = test.getElementsByTagName('failure')[0]
 
                     curr_test_results['failure_message'] = failure_elem.getElementsByTagName('message')[0].firstChild.wholeText
                     curr_test_results['failure_stacktrace'] = failure_elem.getElementsByTagName('stack-trace')[0].firstChild.wholeText
                 else:
                     # Test could not be run...
                     num_skipped_tests = num_skipped_tests + 1
                     assert(test_new_runstate == 'NotRunnable')
                     curr_test_results['state'] = 'NotRunnable'
                     curr_test_results['result'] = 'NotRunnable'
                     curr_test_results['executed'] = 'False'
                     curr_test_results['success'] = 'False'
                     curr_test_results['time'] = test.getAttribute('duration')
 
                     curr_test_results['reason_message'] = test.getElementsByTagName('failure')[0].getElementsByTagName('message')[0].firstChild.wholeText
             elif (test_new_result == 'Skipped'):
                 # Test was Ignored...
                 num_ignored_tests = num_ignored_tests + 1
                 curr_test_results['state'] = 'Ignored'
                 curr_test_results['result'] = 'Ignored'
                 curr_test_results['executed'] = 'False'
                 curr_test_results['reason_message'] = test.getElementsByTagName('reason')[0].getElementsByTagName('message')[0].firstChild.wholeText
             else:
                 assert False, 'Unknown test result type: ' + test_new_result
             test_results.append(curr_test_results)
 
         results['num_tests'] = num_tests
         results['num_passed_tests'] = num_passed_tests
         results['num_failed_tests'] = num_failed_tests
         results['num_ignored_tests'] = num_ignored_tests
         results['num_skipped_tests'] = num_skipped_tests
 
         date_time = collection.getAttribute('start-time').split(' ')
         results['date'] = date_time[0]
         results['time'] = date_time[1]
 
         return results
 
     def write_output(results, output_path):
         # Write XML File (minidom)
 
         doc = xml.dom.minidom.Document()
 
         num_tests = results['num_tests']
         num_skipped_tests = results['num_skipped_tests']
         num_ignored_tests = results['num_ignored_tests']
         num_not_run_tests = num_skipped_tests + num_ignored_tests
         num_failed_tests = results['num_failed_tests']
 
         suite_executed = (num_tests - num_not_run_tests) > 0
         suite_success = num_skipped_tests + num_failed_tests == 0
 
         root = doc.createElement('test-results')
         root.setAttribute('name', 'Unity Tests')
         root.setAttribute('total', str(num_tests - num_not_run_tests))
         root.setAttribute('errors', str(0))
         root.setAttribute('failures', str(num_failed_tests))
         root.setAttribute('not-run', str(num_not_run_tests))
         root.setAttribute('inconclusive', str(0))
         root.setAttribute('ignored', str(num_ignored_tests))
         root.setAttribute('skipped', str(num_skipped_tests))
         root.setAttribute('invalid', str(0))
         root.setAttribute('date', str(results['date']))
         root.setAttribute('time', str(results['time']))
         doc.appendChild(root)
 
         test_suite = doc.createElement('test-suite')
         test_suite.setAttribute('name', results['name'])
         test_suite.setAttribute('type', 'Assembly')
         test_suite.setAttribute('executed', 'True' if suite_executed else 'False')
         test_suite.setAttribute('result', 'Success' if suite_success else 'Failure')
         test_suite.setAttribute('success', 'True' if suite_success else 'False')
         test_suite.setAttribute('time', results['duration'])
         root.appendChild(test_suite)
 
         results_elem = doc.createElement('results')
         test_suite.appendChild(results_elem)
         
 
         test_results = results['test_results']
         for curr_test_results in test_results:
 
             test_case = doc.createElement('test-case')
             results_elem.appendChild(test_case)
 
             test_case.setAttribute('name', curr_test_results['name'])
             test_case.setAttribute('executed', curr_test_results['executed'])
             test_case.setAttribute('result', curr_test_results['result'])
 
             run_state = curr_test_results['state']
             if (run_state == 'Success'):
                 # Success...
                 test_case.setAttribute('success', curr_test_results['success'])
                 test_case.setAttribute('time', curr_test_results['time'])
 
             elif (run_state == 'Failed'):
                 # Failed...
                 test_case.setAttribute('success', curr_test_results['success'])
                 test_case.setAttribute('time', curr_test_results['time'])
 
                 failure = doc.createElement('failure')
                 test_case.appendChild(failure)
 
                 message = doc.createElement('message')
                 message_cdata = doc.createCDATASection(curr_test_results['failure_message'])
                 message.appendChild(message_cdata)
                 failure.appendChild(message)
 
                 stack_trace = doc.createElement('stack-trace')
                 stack_trace_cdata = doc.createCDATASection(curr_test_results['failure_stacktrace'])
                 stack_trace.appendChild(stack_trace_cdata)
                 failure.appendChild(stack_trace)
 
             elif (run_state == 'NotRunnable'):
                 # Not Runnable...
                 test_case.setAttribute('success', curr_test_results['success'])
                 test_case.setAttribute('time', curr_test_results['time'])
                 
                 reason = doc.createElement('reason')
                 test_case.appendChild(reason)
 
                 message = doc.createElement('message')
                 message_cdata = doc.createCDATASection(curr_test_results['reason_message'])
                 message.appendChild(message_cdata)
                 reason.appendChild(message)
 
             elif(run_state == 'Ignored'):
 
                 reason = doc.createElement('reason')
                 test_case.appendChild(reason)
 
                 message = doc.createElement('message')
                 message_cdata = doc.createCDATASection(curr_test_results['reason_message'])
                 message.appendChild(message_cdata)
                 reason.appendChild(message)
 
             else:
                 print ("Unknown run state: " + run_state)
 
         # Write the document out; 'with' ensures the file handle is closed.
         with open(output_path, 'w') as output_file:
             doc.writexml(output_file,
                          indent="    ",
                          addindent="    ",
                          newl='\n')

         doc.unlink()
 
     results = read_input(input_path)
     write_output(results, output_path)
 
 
 def main():
     parser = argparse.ArgumentParser(description='Convert an NUnit 3 file to an NUnit 2 file')
     required_named = parser.add_argument_group('Required named arguments')
     required_named.add_argument('-i', '--input', dest='input', help='Input file name', required=True)
     required_named.add_argument('-o', '--output', dest='output', help='Output file name', required=True)
     args = parser.parse_args()
 
     input_path = args.input
     output_path = args.output
 
     if (not os.path.isfile(input_path)):
         print ("Input file does not exist")
         return 1
 
     print ("Converting " + input_path + " to " + output_path)
     convert(input_path, output_path)
     return 0
 
 
 if __name__ == "__main__":
     sys.exit(main())
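
Usage, with example file names:

 python NUnit3To2FormatConverter.py -i results-nunit3.xml -o results-nunit2.xml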
 
Comment by pdondziakPix · Jul 06, 2017 at 02:02 PM

@CapeGuyBen I'm using TeamCity 2017.1.2 and it still has this issue. But some work has been done in this module, because the debug messages have improved compared to version 10.0.4.

Reply by CapeGuyBen to pdondziakPix · Jul 06, 2017 at 02:10 PM

Yep, just checked the ticket and they've recently edited it to say it will be fixed in Indore 2017.1.3 instead. They had already done that, I just didn't notice before writing my answer. I've updated it now.

Comment by fjhamming_CleVR · Jul 06, 2017 at 03:04 PM

Thank you. This solved my problem.

Comment by wootencl · Jul 06, 2017 at 07:01 PM

Tests were ignored for some reason with @pdondziakPix's script. Maybe it's because we're using TC 9.1.5. This script worked very well though.


Answer by fjhamming_CleVR · Jul 06, 2017 at 10:23 AM

Your solution seems promising. I tried to implement this in our Bamboo workflow, but it doesn't seem to be able to parse the results.

Of course I understand that the script is supplied as-is.

Any clue what the difference might be between the TeamCity parser and the Bamboo parser?

For completeness I have attached the original and converted files to this post.

Attachment: testresults.zip (2.7 kB): /storage/temp/97236-testresults.zip
Comment by pdondziakPix · Jul 06, 2017 at 11:00 AM

@fjhamming_CleVR Any debug messages from Bamboo during results parsing? TeamCity was very helpful, providing error messages like a missing test-results tag.

Comment by pdondziakPix · Jul 06, 2017 at 11:08 AM

@fjhamming_CleVR Try parsing these results in Bamboo:

https://github.com/x97mdr/pickles/blob/master/src/Pickles/Pickles.Test/results-example-nunit.xml

I used it as a working example in TeamCity.

Reply by fjhamming_CleVR to pdondziakPix · Jul 06, 2017 at 11:56 AM

A big difference between the converted XML and your test XML is the inclusion of attributes in the root element, plus the environment element and the culture-info element. Could it be that these are required by the Bamboo parser?
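If those elements are the sticking point, here's a minimal sketch of what could be bolted onto the converter's write_output. The element and attribute names follow NUnit 2 sample files, but the values below are placeholders or best-effort guesses, so check them against a real NUnit 2 result file:

 import getpass
 import os
 import platform

 def add_environment_elements(doc, root):
     ''' Append <environment> and <culture-info> children to the
         <test-results> root element, as seen in NUnit 2 sample files.
     '''
     environment = doc.createElement('environment')
     environment.setAttribute('nunit-version', '2.6.4')  # placeholder
     environment.setAttribute('os-version', platform.version())
     environment.setAttribute('platform', platform.system())
     environment.setAttribute('cwd', os.getcwd())
     environment.setAttribute('machine-name', platform.node())
     environment.setAttribute('user', getpass.getuser())

     culture_info = doc.createElement('culture-info')
     culture_info.setAttribute('current-culture', 'en-US')    # placeholder
     culture_info.setAttribute('current-uiculture', 'en-US')  # placeholder

     # Insert before any existing children so the element order matches
     # the samples: environment, culture-info, then test-suite.
     root.insertBefore(culture_info, root.firstChild)
     root.insertBefore(environment, culture_info)

In the converter above this would be called on root right after it is created, e.g. following doc.appendChild(root).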

Reply by pdondziakPix to fjhamming_CleVR · Jul 06, 2017 at 12:01 PM

Could be; our converter just adds a test-results root element and a results tag under every test-suite tag it finds.
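
In other words, something along these lines (a sketch of the approach described, not the actual converter):

 import xml.dom.minidom

 def wrap_nunit3_in_nunit2_shell(input_path, output_path):
     ''' Rename the NUnit 3 root element to test-results and move each
         test-suite's children under a new results element.
     '''
     doc = xml.dom.minidom.parse(input_path)
     root = doc.documentElement
     root.tagName = root.nodeName = 'test-results'
     for suite in doc.getElementsByTagName('test-suite'):
         results = doc.createElement('results')
         # Move all existing children (test-cases, nested suites) into it.
         while suite.firstChild is not None:
             results.appendChild(suite.firstChild)
         suite.appendChild(results)
     with open(output_path, 'w') as out:
         doc.writexml(out)

Note this keeps the NUnit 3 attribute names, so a strict NUnit 2 parser may still reject it; it is only meant to illustrate the shape of the transformation.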

Comment by fjhamming_CleVR · Jul 06, 2017 at 11:46 AM

Bamboo's log says:

 06-Jul-2017 11:22:13 Starting task 'Validate Unity Integration Test Results' of type 'com.atlassian.bamboo.plugin.dotnet:nunit'
 06-Jul-2017 11:22:13 Parsing test results under J:\B\FRA$$anonymous$$-SPAW-IT...
 06-Jul-2017 11:22:13 Finished task 'Validate Unity Integration Test Results' with result: Success

while the dashboard just says that 7 tests were skipped (the same result as when not converting the XML).

The example you provided just works, by the way.
