# LightGbmArguments.GossBooster.Arguments

This documentation is generated from the sources available at dotnet/machinelearning and is released under the MIT License.

**Type:** argument
**Aliases:** *Microsoft.ML.Runtime.LightGBM.LightGbmArguments+GossBooster+Arguments*
**Namespace:** Microsoft.ML.Runtime.LightGBM
**Assembly:** Microsoft.ML.LightGBM.dll
**Microsoft Documentation:** LightGbmArguments.GossBooster.Arguments

**Description**

Arguments for the GOSS (Gradient-based One-Side Sampling) booster. GOSS retains the training instances with the largest gradients and randomly samples from the remaining, small-gradient instances; the retain ratios are controlled by TopRate and OtherRate below.

**Parameters**

Name | Short name | Default | Description
---|---|---|---
FeatureFraction | ff | 1 | Subsample ratio of columns when constructing each tree. Range: (0,1].
MaxDepth | | 0 | Maximum depth of a tree. 0 means no limit; the tree still grows best-first.
MinChildWeight | | 0.1 | Minimum sum of instance weight (hessian) needed in a child. If the tree partition step results in a leaf node with a sum of instance weight less than min_child_weight, the building process gives up further partitioning. In linear regression mode, this simply corresponds to the minimum number of instances needed in each node. The larger the value, the more conservative the algorithm.
MinSplitGain | | 0 | Minimum loss reduction required to make a further partition on a leaf node of the tree. The larger the value, the more conservative the algorithm.
OtherRate | | 0.1 | Retain ratio for small-gradient instances.
RegAlpha | l1 | 0 | L1 regularization term on weights. Increasing this value makes the model more conservative.
RegLambda | l2 | 0.01 | L2 regularization term on weights. Increasing this value makes the model more conservative.
ScalePosWeight | | 1 | Controls the balance of positive and negative weights; useful for unbalanced classes. A typical value to consider: sum(negative cases) / sum(positive cases).
Subsample | | 1 | Subsample ratio of the training instances. Setting it to 0.5 means that LightGBM randomly selects half of the data instances to grow trees, which helps prevent overfitting. Range: (0,1].
SubsampleFreq | | 0 | Subsample frequency. 0 means no subsampling. If SubsampleFreq > 0, a subset (ratio = Subsample) is used for training, and the subset is updated every SubsampleFreq iterations.
TopRate | | 0.2 | Retain ratio for large-gradient instances.
UnbalancedSets | us | False | Use for binary classification when classes are not balanced.
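To make the TopRate and OtherRate parameters concrete, the following is a minimal Python sketch of the GOSS sampling step as described in the LightGBM paper: keep the TopRate fraction of instances with the largest absolute gradients, randomly sample an OtherRate fraction of the rest, and up-weight the sampled small-gradient instances so the overall gradient distribution stays approximately unbiased. The function name `goss_sample` is hypothetical and not part of the ML.NET or LightGBM API; this is an illustration, not the library's implementation.

```python
import random

def goss_sample(gradients, top_rate=0.2, other_rate=0.1, rng=None):
    """Illustrative GOSS sampling sketch (not the library implementation).

    Returns the indices of the retained instances and their weights.
    """
    rng = rng or random.Random()
    n = len(gradients)
    n_top = int(top_rate * n)
    n_other = int(other_rate * n)
    # Sort instance indices by absolute gradient, largest first.
    order = sorted(range(n), key=lambda i: -abs(gradients[i]))
    top_idx = order[:n_top]                       # always retained
    other_idx = rng.sample(order[n_top:], n_other)  # random subset of the rest
    # Small-gradient instances are up-weighted by (1 - top_rate) / other_rate
    # to compensate for the one-sided sampling.
    amplify = (1.0 - top_rate) / other_rate
    weights = [1.0] * n_top + [amplify] * n_other
    return top_idx + other_idx, weights
```

With the defaults (TopRate = 0.2, OtherRate = 0.1), 30% of the instances are used per iteration, and the randomly sampled small-gradient instances each carry a weight of (1 - 0.2) / 0.1 = 8.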