[Fix] fix doxygen comments
Signed-off-by: Jiho Chu <[email protected]>
jihochu committed Mar 7, 2024
1 parent c63d80b commit afa9eff
Showing 2 changed files with 5 additions and 3 deletions.
5 changes: 3 additions & 2 deletions nntrainer/graph/network_graph.cpp
@@ -85,7 +85,8 @@ int NetworkGraph::compile(const std::string &loss_type) {
   status = checkCompiledGraph();
   NN_RETURN_STATUS();
 
-  /* @note It can be integrated with addLossLayer method
+  /**
+   * @note It can be integrated with addLossLayer method
    * if it removes adding loss layer to the model directly.
    */
   for (auto iter = cbegin(); iter != cend(); iter++) {
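For context, Doxygen only parses specially marked comment blocks, so the plain /* ... */ comment removed above was invisible to the documentation generator and any commands inside it were never processed. A minimal standalone illustration of the two styles (not taken from the nntrainer sources):

/* A plain C-style comment: Doxygen skips this block, so commands
 * such as @note or @todo written here never reach the generated docs. */

/**
 * A Doxygen comment block (note the double asterisk on the opener).
 * @note Commands here are parsed and shown in the generated documentation.
 * @todo Commands are case sensitive: @todo is recognized, @TODO is not.
 */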
@@ -447,7 +448,7 @@ void NetworkGraph::backwarding(
 
   // check first layer's derivative is valid
   // loss scale is adjusted between 1.0f ~ 256.0f
-  // @TODO provide max scale property
+  // @todo provide max scale property
   auto &ln = *(cbegin() + 1);
   if (loss_scale != 0.0f && !ln->getRunContext().validateDerivatives()) {
     // It will not apply train results if data is invalid
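The hunk above sits on the mixed-precision path: the derivative check decides whether a training step's results are applied, and the loss scale is kept between 1.0f and 256.0f (the @todo asks for the upper bound to become a configurable property). Below is a rough, hypothetical sketch of dynamic loss scaling in that spirit; the struct, names, and doubling/halving policy are illustrative and are not nntrainer's actual implementation.

#include <algorithm>

// Hypothetical dynamic loss scaler. The bounds mirror the 1.0f ~ 256.0f
// range from the comment; max_scale is what the @todo would turn into a
// user-visible property.
struct LossScaler {
  float scale = 256.0f;
  float min_scale = 1.0f;
  float max_scale = 256.0f;

  // Call after backpropagation; derivatives_valid means no Inf/NaN was found.
  // Returns whether this step's gradients should be applied.
  bool update(bool derivatives_valid) {
    if (!derivatives_valid) {
      // Skip the update and back off so the next pass stays inside fp16 range.
      scale = std::max(min_scale, scale / 2.0f);
      return false;
    }
    // Otherwise grow the scale again, clamped to the maximum.
    scale = std::min(max_scale, scale * 2.0f);
    return true;
  }
};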
3 changes: 2 additions & 1 deletion nntrainer/layers/loss/mse_loss_layer.cpp
@@ -76,7 +76,8 @@ void MSELossLayer::calcDerivative(RunLayerContext &context) {
 
   float divider = ((float)y.size()) / 2;
 
-  /* ret_derivative may be eliminated by big divider with fp16 calculation.
+  /**
+   * ret_derivative may be eliminated by big divider with fp16 calculation.
    * So, it is calculated with larger precision.
    */
   int ret;
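The higher-precision calculation exists because fp16 underflows easily: its smallest normal value is about 6.1e-5 and its smallest subnormal about 6.0e-8, so dividing a small derivative by a large divider (y.size() / 2) can flush the result to zero. A standalone sketch with hypothetical numbers, independent of the nntrainer tensor API:

#include <cstdio>

int main() {
  // Hypothetical values: a small per-element derivative and a large tensor.
  const float elem = 1.0e-4f;
  const float divider = 500000.0f / 2.0f; // divider = y.size() / 2
  const float result = elem / divider;    // ~4e-10, still representable in fp32

  // fp16 limits: smallest normal ~6.10e-5, smallest subnormal ~5.96e-8.
  const float fp16_min_subnormal = 5.96e-8f;
  std::printf("fp32 result: %g\n", result);
  std::printf("representable in fp16: %s\n",
              result >= fp16_min_subnormal ? "yes" : "no (flushes to zero)");
  return 0;
}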
