enhance the prompt for unit testing

pull/108/head · Elena Shirokova · 4 months ago · commit 2e19655534
38  week4/community-contributions/unit-tests-generator.ipynb

@@ -198,12 +198,28 @@
 "source": [
 "def get_user_prompt(code):\n",
 "\n",
-" user_prompt = \"Write for a python code the unit test cases.\"\n",
-" user_prompt += \"Return readable unit tests cases using pytest library, do not create any custom imports, don't forget to import errors if needed; do not explain your work other than a few comments.\"\n",
-" user_prompt += \"The tests should include normal inputs, the inputs where the code is expected to fail, edge case and error handling.\"\n",
-" user_prompt += \"Do not insert the function to be tested in the output before the tests.\"\n",
" user_prompt = \"\"\"Test include:\n",
"\n",
" - Valid inputs with expected results.\n",
" - Inputs that test the boundaries or limits of the function's behavior.\n",
" - Invalid inputs or scenarios where the function is expected to raise exceptions.\n",
"\n",
" Structure:\n",
"\n",
" - Begin with all necessary imports. \n",
" - Do not create custom imports. \n",
" - Do not insert in the response the function for the tests.\n",
" - Ensure proper error handling for tests that expect exceptions.\n",
" - Clearly name the test functions to indicate their purpose (e.g., test_function_name).\n",
"\n",
" Example Structure:\n",
"\n",
" - Use pytest.raises to validate exceptions.\n",
" - Use assertions to verify correct outputs for successful and edge cases.\n",
"\n",
" Documentation:\n",
"\n",
" - Add docstrings explaining what each test verifies.\"\"\"\n",
" user_prompt += code\n",
"\n",
" return user_prompt"
@@ -298,6 +314,8 @@
 "source": [
 "function_to_test = \"\"\"\n",
 " def lengthOfLongestSubstring(s):\n",
+" if not isinstance(s, str):\n",
+" raise TypeError(\"Input must be a string\")\n",
 " max_length = 0\n",
 " substring = \"\"\n",
 " start_idx = 0\n",
@@ -343,7 +361,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 12,
+"execution_count": 11,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -398,16 +416,10 @@
 " save_test_run.click(save_unit_tests, inputs=[unit_tests_out])\n",
 "\n",
 "\n",
-"ui.launch(inbrowser=True)"
+"ui.launch(inbrowser=True)\n",
+"# ui.launch()"
 ]
 },
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": []
-},
 {
 "cell_type": "code",
 "execution_count": null,
