# 2ticketss

Two complementary command-line tools for GitHub issue management:

- **00-jira-to-gh-issues**: a Rust tool that converts Jira CSV exports to GitHub issue markdown files compatible with gh-issue-generator. It handles messy CSV data and preserves issue metadata.
- **01-gh-issue-generator**: a Rust tool that creates GitHub issues from Markdown files with YAML front matter. It parses structured Markdown, supports batch processing, and integrates with the GitHub CLI.
### .gitignore (vendored, +8)
```diff
@@ -20,3 +20,11 @@ Cargo.lock
 # and can be added to the global gitignore or merged into this file. For a more nuclear
 # option (not recommended) you can uncomment the following to ignore the entire idea folder.
 #.idea/
+
+# Generated output directories
+output/
+00-jira-to-gh-issues/output/
+01-gh-issue-generator/report/
+
+# macOS files
+.DS_Store
```
### 00-jira-to-gh-issues/Cargo.toml (new file, +17)
```toml
[package]
name = "jira-to-gh-issues"
version = "0.1.0"
edition = "2021"
description = "A tool to convert Jira CSV exports to GitHub issue markdown format"
authors = ["b3"]

[dependencies]
clap = { version = "4.4", features = ["derive"] }
csv = "1.2"
anyhow = "1.0"
chrono = "0.4"
serde = { version = "1.0", features = ["derive"] }
dirs = "5.0"
toml = "0.8"
log = "0.4"
env_logger = "0.10"
```
### 00-jira-to-gh-issues/README.md (new file, +232)
# Jira to GitHub Issues Converter

A Rust command-line tool that converts Jira CSV exports to GitHub issue markdown files compatible with the [gh-issue-generator](../01-gh-issue-generator) tool.

## Prerequisites

- [Rust](https://www.rust-lang.org/tools/install) installed (cargo, rustc)
- Basic familiarity with Jira CSV exports
## Installation

### Option 1: Build from source

1. Clone this repository:
   ```bash
   git clone https://github.com/bee8333/2ticketss.git
   cd 2ticketss/00-jira-to-gh-issues
   ```

2. Build the application:
   ```bash
   cargo build --release
   ```

3. The executable will be in `target/release/jira-to-gh-issues`.

4. Optional: add it to your PATH:
   ```bash
   # Linux/macOS
   cp target/release/jira-to-gh-issues ~/.local/bin/

   # or add the following to your .bashrc or .zshrc
   export PATH="$PATH:/path/to/2ticketss/00-jira-to-gh-issues/target/release"
   ```

### Option 2: Install with Cargo

If you have Rust installed, you can install directly with:

```bash
cargo install --path .
```
## Exporting Issues from Jira

1. In Jira, go to the issue list view (with all filters applied)
2. Click "Export" and select "CSV (All fields)"
3. Save the CSV file

Required Jira fields:

- Summary
- Issue key

Recommended fields (for best results):

- Issue Type
- Priority
- Description
- Assignee
- Reporter
- Created
- Labels
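The required-field check the tool performs can be sketched as follows. This is a stdlib-only illustration that splits the header row on commas, so it ignores quoted headers containing commas; the tool itself uses the `csv` crate for robust parsing.

```rust
// Report which required Jira columns are missing from a CSV header row.
// Naive comma split: quoted headers containing commas are not handled
// (the real tool uses the csv crate for that).
fn missing_required(header_row: &str, required: &[&str]) -> Vec<String> {
    let columns: Vec<&str> = header_row.split(',').map(|c| c.trim()).collect();
    let mut missing = Vec::new();
    for req in required {
        if !columns.contains(req) {
            missing.push(req.to_string());
        }
    }
    missing
}

fn main() {
    let header = "Issue key,Summary,Issue Type,Priority,Description";
    let missing = missing_required(header, &["Summary", "Issue key", "Assignee"]);
    println!("{:?}", missing); // ["Assignee"]
}
```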
## Usage

```bash
jira-to-gh-issues [OPTIONS] --input <jira_export.csv>
```

Arguments:

- `--input, -i <jira_export.csv>`: Path to the CSV file exported from Jira (required)

Options:

- `--output, -o <output_dir>`: Directory where the markdown files will be saved
- `--status, -s <status>`: Default status for the issues (`draft` or `ready`)
- `--verbose, -v`: Increase verbosity level (can be used multiple times: `-v`, `-vv`)
- `--config, -c <config>`: Path to a config file
### Logging Options

You can control log verbosity in two ways:

1. Using the `--verbose` flag:
   - `-v`: Debug level logging
   - `-vv`: Trace level logging

2. Using the `RUST_LOG` environment variable:
   ```bash
   # Set global log level
   RUST_LOG=debug jira-to-gh-issues --input issues.csv

   # Set per-module log level
   RUST_LOG=jira_to_gh_issues=trace,csv=warn jira-to-gh-issues --input issues.csv
   ```
## Configuration

The tool supports configuration files for storing default settings. Configuration is searched for in these locations (in order):

1. The path specified with the `--config` option
2. `.jira-to-gh-issues.toml` in the current directory
3. `~/.jira-to-gh-issues.toml` in the home directory
4. `~/.config/jira-to-gh-issues/config.toml` in the XDG config directory

A default configuration file is created at `~/.config/jira-to-gh-issues/config.toml` if none exists.

### Configuration Options

```toml
# Default output directory
output_dir = "00-jira-to-gh-issues/output"

# Default issue status (draft or ready)
default_status = "ready"

# Required fields in CSV
required_fields = ["Summary", "Issue key"]
```

### Creating or Editing Configuration

You can manually create or edit the configuration file, or run the tool once to create a default configuration automatically.

For example, to set custom defaults:

```bash
mkdir -p ~/.config/jira-to-gh-issues
cat > ~/.config/jira-to-gh-issues/config.toml << EOF
output_dir = "github-issues/jira-exports"
default_status = "draft"
required_fields = ["Summary", "Issue key", "Description"]
EOF
```
## Example Workflow

### 1. Export issues from Jira

Export your issues from Jira as described above. This example assumes you saved them to `sprint-backlog.csv`.

### 2. Convert to GitHub issue format

```bash
jira-to-gh-issues --input sprint-backlog.csv --output github-issues
```

### 3. Create GitHub issues using gh-issue-generator

```bash
cd ..
gh-issue-generator --repo myuser/myrepo github-issues/20240405123045/batch.md
```
## How It Works

1. The tool reads a CSV file exported from Jira.
2. For each issue in the CSV, it:
   - extracts the relevant information (title, description, labels, etc.)
   - converts it to the GitHub issue markdown format with YAML front matter
   - writes an individual markdown file for the issue
   - appends the issue to a batch file containing all issues
3. Output is saved in a timestamped directory to avoid overwriting previous conversions.
4. An error log is generated for any issues that couldn't be processed.
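A downstream consumer can recover the individual issues from the batch file by splitting on the `---ISSUE---` separator the tool writes between issues. A minimal sketch, assuming the batch header line is the only non-issue content (this is an illustration, not gh-issue-generator's actual parser):

```rust
// Split a batch file into individual issue documents. Issues are
// separated by "---ISSUE---" lines; the leading "# Jira Issues Batch"
// header block is filtered out (an assumption of this sketch).
fn split_batch(batch: &str) -> Vec<String> {
    batch
        .split("---ISSUE---")
        .map(|part| part.trim())
        .filter(|part| !part.is_empty() && !part.starts_with("# Jira Issues Batch"))
        .map(|part| part.to_string())
        .collect()
}

fn main() {
    let batch = "# Jira Issues Batch\n\n---ISSUE---\n\nbody one\n\n---ISSUE---\n\nbody two\n";
    let issues = split_batch(batch);
    println!("{} issues", issues.len()); // 2 issues
}
```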
## Output Format

The tool generates:

1. Individual markdown files for each issue (named after the Jira issue key)
2. A batch file containing all issues in the format expected by gh-issue-generator
3. An error log file listing any processing issues

Each generated file follows the format required by the gh-issue-generator tool:

```markdown
---
title: Issue title
status: ready # or draft
labels:
  - label1
  - label2
assignees:
  - assignee1
---

# Issue title

## Description

Issue description...

## Metadata

- **Jira Issue**: JIRA-123
- **Created**: Creation date
- **Reporter**: Reporter name
```
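The front matter in a generated file can be separated from the body by scanning for the `---` delimiter lines. A minimal sketch, assuming the document starts with `---` and the front matter is closed by the next line consisting solely of `---` (again, not gh-issue-generator's actual parser):

```rust
// Split a generated issue file into (front matter, body).
// Returns None if the document does not start with a "---" block.
fn split_front_matter(doc: &str) -> Option<(String, String)> {
    let rest = doc.strip_prefix("---\n")?;
    let end = rest.find("\n---\n")?;
    let front = rest[..end].to_string();
    let body = rest[end + 5..].to_string(); // skip the "\n---\n" delimiter
    Some((front, body))
}

fn main() {
    let doc = "---\ntitle: Fix login bug\nstatus: ready\n---\n\n# Fix login bug\n";
    if let Some((front, body)) = split_front_matter(doc) {
        println!("front matter: {:?}", front);
        println!("body starts: {:?}", body.lines().next());
    }
}
```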
## Mapping Rules

| Jira Field  | GitHub Issue Field             |
|-------------|--------------------------------|
| Summary     | title                          |
| Issue Type  | Converted to a label           |
| Priority    | High/Critical → priority label |
| Labels      | labels                         |
| Description | Description section            |
| Assignee    | assignees                      |
| Reporter    | Listed in metadata             |
| Created     | Listed in metadata             |
| Issue Key   | Listed in metadata & filename  |
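The Issue Type row of the table corresponds to a simple match in the source; copied here standalone so it can be exercised directly (Jira sometimes appends an emoji suffix to the type name, which the match also accepts):

```rust
// Map a Jira issue type (with or without Jira's emoji suffix)
// onto a GitHub label; anything unrecognized becomes "other".
fn issue_type_label(issue_type: &str) -> &'static str {
    match issue_type {
        "Bug" | "Bug⚠️" => "bug",
        "Feature" | "Feature🔺" => "enhancement",
        "Refactor" | "Refactor♻️" => "refactor",
        "Task" | "Task✔️" => "task",
        _ => "other",
    }
}

fn main() {
    println!("{}", issue_type_label("Bug"));   // bug
    println!("{}", issue_type_label("Story")); // other
}
```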
## Troubleshooting

### CSV Import Issues

- Ensure the CSV export from Jira includes the required fields (Summary, Issue key)
- If the import fails with "Missing required columns", verify that your Jira export contains these field names
- For malformed CSV files, try opening the file in a spreadsheet application and re-saving it

### Character Encoding

- If you see corrupted characters, ensure your CSV is UTF-8 encoded
- Some Jira instances export in different encodings depending on region settings

### Large Exports

- For very large exports (1000+ issues), consider splitting into multiple files
- Processing large files may require more memory
## Limitations

- Complex Jira markup may not convert perfectly to Markdown
- Attachments from Jira issues are not transferred
- Comments are not included in the conversion
## License

MIT
### 00-jira-to-gh-issues/src/main.rs (new file, +548)
use std::fs::{self, File};
|
||||||
|
use std::io::{Write, Read};
|
||||||
|
use std::path::PathBuf;
|
||||||
|
use std::collections::HashMap;
|
||||||
|
|
||||||
|
use anyhow::{Context, Result, anyhow};
|
||||||
|
use chrono::Local;
|
||||||
|
use clap::Parser;
|
||||||
|
use csv::ReaderBuilder;
|
||||||
|
use dirs::home_dir;
|
||||||
|
use log::{info, warn, error, debug, trace, LevelFilter};
|
||||||
|
use serde::{Deserialize, Serialize};
|
||||||
|
|
||||||
|
/// Jira CSV to GitHub Issues Converter
|
||||||
|
///
|
||||||
|
/// This tool converts Jira CSV exports to GitHub issue markdown files
|
||||||
|
/// that are compatible with the gh-issue-generator tool. It handles field
|
||||||
|
/// mapping, formatting conversions, and generates both individual markdown
|
||||||
|
/// files and a batch file for bulk creation of GitHub issues.
|
||||||
|
#[derive(Parser, Debug)]
|
||||||
|
#[command(
|
||||||
|
author,
|
||||||
|
version,
|
||||||
|
about,
|
||||||
|
long_about = "A command-line utility that converts Jira CSV exports to GitHub issue markdown files.
|
||||||
|
It extracts key fields from the Jira export, including summary, description, issue type, priority,
|
||||||
|
labels, assignees, and other metadata, then converts them to GitHub-compatible format.
|
||||||
|
|
||||||
|
The tool creates a timestamped output directory containing:
|
||||||
|
- Individual markdown files for each issue
|
||||||
|
- A batch file containing all issues
|
||||||
|
- An error log tracking any issues encountered during processing
|
||||||
|
|
||||||
|
For best results, ensure your Jira export includes at least the Summary and Issue key fields."
|
||||||
|
)]
|
||||||
|
struct Args {
|
||||||
|
/// Input CSV file exported from Jira
|
||||||
|
///
|
||||||
|
/// Path to the CSV file exported from Jira containing issues to convert.
|
||||||
|
/// Export this file from Jira by going to the issue list view,
|
||||||
|
/// clicking 'Export' and selecting 'CSV (All fields)'.
|
||||||
|
#[arg(short, long, required = true)]
|
||||||
|
input: PathBuf,
|
||||||
|
|
||||||
|
/// Output directory for generated markdown files
|
||||||
|
///
|
||||||
|
/// Directory where the markdown files will be saved. A timestamped
|
||||||
|
/// subdirectory will be created within this path to avoid overwriting
|
||||||
|
/// previous conversions.
|
||||||
|
#[arg(short, long)]
|
||||||
|
output: Option<PathBuf>,
|
||||||
|
|
||||||
|
/// Default status to use (draft or ready)
|
||||||
|
///
|
||||||
|
/// Determines whether created issues are marked as draft or ready.
|
||||||
|
/// Draft issues will not be created when using gh-issue-generator.
|
||||||
|
#[arg(short, long)]
|
||||||
|
status: Option<String>,
|
||||||
|
|
||||||
|
/// Verbosity level (can be used multiple times)
|
||||||
|
///
|
||||||
|
/// Controls the amount of information displayed during execution:
|
||||||
|
/// -v: Debug level (detailed information for troubleshooting)
|
||||||
|
/// -vv: Trace level (very verbose, all operations logged)
|
||||||
|
///
|
||||||
|
/// You can also use the RUST_LOG environment variable instead.
|
||||||
|
#[arg(short, long, action = clap::ArgAction::Count)]
|
||||||
|
verbose: u8,
|
||||||
|
|
||||||
|
/// Path to config file
|
||||||
|
///
|
||||||
|
/// Path to a TOML configuration file. If not specified, the tool will
|
||||||
|
/// search for configuration in standard locations:
|
||||||
|
/// - .jira-to-gh-issues.toml in current directory
|
||||||
|
/// - ~/.jira-to-gh-issues.toml in home directory
|
||||||
|
/// - ~/.config/jira-to-gh-issues/config.toml
|
||||||
|
#[arg(short = 'c', long)]
|
||||||
|
config: Option<PathBuf>,
|
||||||
|
}
|
||||||
|
|
||||||
|
#[derive(Debug, Serialize, Deserialize)]
|
||||||
|
struct Config {
|
||||||
|
/// Default output directory
|
||||||
|
output_dir: Option<String>,
|
||||||
|
|
||||||
|
/// Default issue status (draft or ready)
|
||||||
|
default_status: Option<String>,
|
||||||
|
|
||||||
|
/// Required fields in CSV (comma-separated list)
|
||||||
|
#[serde(default = "default_required_fields")]
|
||||||
|
required_fields: Vec<String>,
|
||||||
|
}
|
||||||
|
|
||||||
|
fn default_required_fields() -> Vec<String> {
|
||||||
|
vec!["Summary".to_string(), "Issue key".to_string()]
|
||||||
|
}
|
||||||
|
|
||||||
|
impl Default for Config {
|
||||||
|
fn default() -> Self {
|
||||||
|
Self {
|
||||||
|
output_dir: Some("00-jira-to-gh-issues/output".to_string()),
|
||||||
|
default_status: Some("ready".to_string()),
|
||||||
|
required_fields: default_required_fields(),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
impl Config {
|
||||||
|
/// Load configuration from file or create default if not exists
|
||||||
|
fn load(config_path: Option<&PathBuf>) -> Result<Self> {
|
||||||
|
// If path is provided, try to load from there
|
||||||
|
if let Some(path) = config_path {
|
||||||
|
debug!("Loading config from specified path: {}", path.display());
|
||||||
|
return Self::load_from_file(path);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try to load from default locations
|
||||||
|
let config_paths = [
|
||||||
|
// Current directory
|
||||||
|
PathBuf::from(".jira-to-gh-issues.toml"),
|
||||||
|
// Home directory
|
||||||
|
home_dir().map(|h| h.join(".jira-to-gh-issues.toml")).unwrap_or_default(),
|
||||||
|
// XDG config directory
|
||||||
|
home_dir().map(|h| h.join(".config/jira-to-gh-issues/config.toml")).unwrap_or_default(),
|
||||||
|
];
|
||||||
|
|
||||||
|
for path in &config_paths {
|
||||||
|
if path.exists() {
|
||||||
|
debug!("Found config file at: {}", path.display());
|
||||||
|
return Self::load_from_file(path);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// No config file found, use defaults
|
||||||
|
debug!("No config file found, using defaults");
|
||||||
|
Ok(Self::default())
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Load configuration from a specific file
|
||||||
|
fn load_from_file(path: &PathBuf) -> Result<Self> {
|
||||||
|
let mut file = File::open(path)
|
||||||
|
.context(format!("Failed to open config file: {}", path.display()))?;
|
||||||
|
|
||||||
|
let mut content = String::new();
|
||||||
|
file.read_to_string(&mut content)
|
||||||
|
.context("Failed to read config file")?;
|
||||||
|
|
||||||
|
toml::from_str(&content)
|
||||||
|
.context("Failed to parse config file as TOML")
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Save configuration to file
|
||||||
|
fn save(&self, path: &PathBuf) -> Result<()> {
|
||||||
|
debug!("Saving config to: {}", path.display());
|
||||||
|
|
||||||
|
// Create parent directories if they don't exist
|
||||||
|
if let Some(parent) = path.parent() {
|
||||||
|
fs::create_dir_all(parent)?;
|
||||||
|
}
|
||||||
|
|
||||||
|
let content = toml::to_string_pretty(self)
|
||||||
|
.context("Failed to serialize config to TOML")?;
|
||||||
|
|
||||||
|
let mut file = File::create(path)
|
||||||
|
.context(format!("Failed to create config file: {}", path.display()))?;
|
||||||
|
|
||||||
|
file.write_all(content.as_bytes())
|
||||||
|
.context("Failed to write config file")?;
|
||||||
|
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Create a default config file if it doesn't exist
|
||||||
|
fn create_default_if_not_exists() -> Result<()> {
|
||||||
|
let default_path = home_dir()
|
||||||
|
.map(|h| h.join(".config/jira-to-gh-issues/config.toml"))
|
||||||
|
.ok_or_else(|| anyhow!("Failed to determine home directory"))?;
|
||||||
|
|
||||||
|
if !default_path.exists() {
|
||||||
|
debug!("Creating default config file at: {}", default_path.display());
|
||||||
|
let default_config = Self::default();
|
||||||
|
default_config.save(&default_path)?;
|
||||||
|
info!("Created default config file at: {}", default_path.display());
|
||||||
|
println!("Created default config file at: {}", default_path.display());
|
||||||
|
}
|
||||||
|
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
struct ProcessingStats {
|
||||||
|
total_rows: usize,
|
||||||
|
successful_conversions: usize,
|
||||||
|
empty_rows: usize,
|
||||||
|
parsing_errors: usize,
|
||||||
|
required_field_missing: HashMap<String, usize>,
|
||||||
|
}
|
||||||
|
|
||||||
|
impl ProcessingStats {
|
||||||
|
fn new() -> Self {
|
||||||
|
Self {
|
||||||
|
total_rows: 0,
|
||||||
|
successful_conversions: 0,
|
||||||
|
empty_rows: 0,
|
||||||
|
parsing_errors: 0,
|
||||||
|
required_field_missing: HashMap::new(),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
fn print_summary(&self) {
|
||||||
|
println!("\nProcessing Statistics:");
|
||||||
|
println!(" Total rows in CSV: {}", self.total_rows);
|
||||||
|
println!(" Successfully converted: {}", self.successful_conversions);
|
||||||
|
println!(" Empty rows skipped: {}", self.empty_rows);
|
||||||
|
println!(" Rows with parsing errors: {}", self.parsing_errors);
|
||||||
|
|
||||||
|
if !self.required_field_missing.is_empty() {
|
||||||
|
println!("\n Required fields missing:");
|
||||||
|
for (field, count) in &self.required_field_missing {
|
||||||
|
println!(" {}: {} occurrences", field, count);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
fn main() -> Result<()> {
|
||||||
|
let args = Args::parse();
|
||||||
|
|
||||||
|
// Initialize logger with appropriate level
|
||||||
|
let log_level = match args.verbose {
|
||||||
|
0 => LevelFilter::Info,
|
||||||
|
1 => LevelFilter::Debug,
|
||||||
|
_ => LevelFilter::Trace,
|
||||||
|
};
|
||||||
|
|
||||||
|
// Initialize the logger, respecting the RUST_LOG env var if set
|
||||||
|
let mut builder = env_logger::Builder::new();
|
||||||
|
|
||||||
|
// If RUST_LOG is set, use that config; otherwise use our default based on -v flags
|
||||||
|
match std::env::var("RUST_LOG") {
|
||||||
|
Ok(_) => {
|
||||||
|
// RUST_LOG is set, initialize with default env_logger behavior
|
||||||
|
builder.init();
|
||||||
|
},
|
||||||
|
Err(_) => {
|
||||||
|
// RUST_LOG not set, use our verbosity flag
|
||||||
|
builder
|
||||||
|
.filter_level(log_level)
|
||||||
|
.format_timestamp(Some(env_logger::fmt::TimestampPrecision::Seconds))
|
||||||
|
.init();
|
||||||
|
|
||||||
|
// Log a hint about RUST_LOG for users who use -v
|
||||||
|
if args.verbose > 0 {
|
||||||
|
debug!("Tip: You can also control log level with the RUST_LOG environment variable");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
info!("Starting Jira to GitHub Issues Converter");
|
||||||
|
|
||||||
|
// Load configuration
|
||||||
|
let config = Config::load(args.config.as_ref())?;
|
||||||
|
debug!("Loaded configuration: {:?}", config);
|
||||||
|
|
||||||
|
// Create default config if it doesn't exist (only in normal operation)
|
||||||
|
if args.verbose == 0 {
|
||||||
|
if let Err(e) = Config::create_default_if_not_exists() {
|
||||||
|
warn!("Failed to create default config file: {}", e);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Determine output directory (command line takes precedence over config)
|
||||||
|
let output_dir = match (&args.output, &config.output_dir) {
|
||||||
|
(Some(path), _) => path.clone(),
|
||||||
|
(None, Some(dir)) => {
|
||||||
|
info!("Using output directory from config: {}", dir);
|
||||||
|
PathBuf::from(dir)
|
||||||
|
},
|
||||||
|
(None, None) => {
|
||||||
|
let default_path = PathBuf::from("00-jira-to-gh-issues/output");
|
||||||
|
info!("Using default output directory: {}", default_path.display());
|
||||||
|
default_path
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
// Determine issue status (command line takes precedence over config)
|
||||||
|
let status = match (&args.status, &config.default_status) {
|
||||||
|
(Some(s), _) => s.clone(),
|
||||||
|
(None, Some(s)) => {
|
||||||
|
info!("Using default status from config: {}", s);
|
||||||
|
s.clone()
|
||||||
|
},
|
||||||
|
(None, None) => {
|
||||||
|
let default_status = "ready".to_string();
|
||||||
|
info!("Using default status: {}", default_status);
|
||||||
|
default_status
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
debug!("Input file: {}", args.input.display());
|
||||||
|
debug!("Output directory: {}", output_dir.display());
|
||||||
|
debug!("Default status: {}", status);
|
||||||
|
|
||||||
|
// Create output directory if it doesn't exist
|
||||||
|
fs::create_dir_all(&output_dir)?;
|
||||||
|
|
||||||
|
// Read CSV file
|
||||||
|
let file = File::open(&args.input)
|
||||||
|
.with_context(|| format!("Failed to open input file: {}", args.input.display()))?;
|
||||||
|
|
||||||
|
let mut rdr = ReaderBuilder::new()
|
||||||
|
.has_headers(true)
|
||||||
|
.flexible(true) // Allow for variable number of fields
|
||||||
|
.from_reader(file);
|
||||||
|
|
||||||
|
// Read headers to identify field positions
|
||||||
|
let headers = rdr.headers()?.clone();
|
||||||
|
info!("Found {} columns in CSV", headers.len());
|
||||||
|
|
||||||
|
// Check for required columns
|
||||||
|
let required_fields: Vec<&str> = config.required_fields.iter().map(|s| s.as_str()).collect();
|
||||||
|
validate_headers(&headers, &required_fields)?;
|
||||||
|
|
||||||
|
// Prepare timestamp directory for output
|
||||||
|
let timestamp = Local::now().format("%Y%m%d%H%M%S").to_string();
|
||||||
|
let batch_dir = output_dir.join(timestamp);
|
||||||
|
fs::create_dir_all(&batch_dir)?;
|
||||||
|
info!("Created output directory: {}", batch_dir.display());
|
||||||
|
|
||||||
|
// Create batch file for all issues
|
||||||
|
let mut batch_file = File::create(batch_dir.join("batch.md"))?;
|
||||||
|
writeln!(batch_file, "# Jira Issues Batch\n")?;
|
||||||
|
writeln!(batch_file, "Generated from Jira export on {}\n", Local::now().format("%Y-%m-%d %H:%M:%S"))?;
|
||||||
|
|
||||||
|
// Process each row in the CSV
|
||||||
|
let mut stats = ProcessingStats::new();
|
||||||
|
let mut error_log = File::create(batch_dir.join("error_log.txt"))?;
|
||||||
|
|
||||||
|
for (row_idx, result) in rdr.records().enumerate() {
|
||||||
|
stats.total_rows += 1;
|
||||||
|
|
||||||
|
let record = match result {
|
||||||
|
Ok(r) => r,
|
||||||
|
Err(e) => {
|
||||||
|
stats.parsing_errors += 1;
|
||||||
|
let error_msg = format!("Error reading row {}: {}\n", row_idx + 1, e);
|
||||||
|
error!("{}", error_msg.trim());
|
||||||
|
writeln!(error_log, "{}", error_msg)?;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
// Extract fields safely with fallback to empty string
|
||||||
|
let summary = get_field_by_name(&record, &headers, "Summary");
|
||||||
|
let issue_key = get_field_by_name(&record, &headers, "Issue key");
|
||||||
|
let issue_type = get_field_by_name(&record, &headers, "Issue Type");
|
||||||
|
let priority = get_field_by_name(&record, &headers, "Priority");
|
||||||
|
let description = get_field_by_name(&record, &headers, "Description");
|
||||||
|
let assignee = get_field_by_name(&record, &headers, "Assignee");
|
||||||
|
let reporter = get_field_by_name(&record, &headers, "Reporter");
|
||||||
|
let created = get_field_by_name(&record, &headers, "Created");
|
||||||
|
|
||||||
|
// Validate required fields
|
||||||
|
let mut missing_fields = Vec::new();
|
||||||
|
if summary.is_empty() { missing_fields.push("Summary"); }
|
||||||
|
if issue_key.is_empty() { missing_fields.push("Issue key"); }
|
||||||
|
|
||||||
|
if !missing_fields.is_empty() {
|
||||||
|
stats.empty_rows += 1;
|
||||||
|
for field in &missing_fields {
|
||||||
|
*stats.required_field_missing.entry(field.to_string()).or_insert(0) += 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
let error_msg = format!("Row {}: Missing required fields: {}\n",
|
||||||
|
row_idx + 1, missing_fields.join(", "));
|
||||||
|
warn!("{}", error_msg.trim());
|
||||||
|
writeln!(error_log, "{}", error_msg)?;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
trace!("Processing row {}: Issue Key = {}, Summary = {}", row_idx + 1, issue_key, summary);
|
||||||
|
|
||||||
|
// Collect all labels from any "Labels" columns
|
||||||
|
let mut labels = Vec::new();
|
||||||
|
for (i, header) in headers.iter().enumerate() {
|
||||||
|
if header == "Labels" && i < record.len() {
|
||||||
|
let label = record.get(i).unwrap_or("").trim();
|
||||||
|
if !label.is_empty() {
|
||||||
|
labels.push(label.to_string());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Convert to GitHub issue format
|
||||||
|
let issue_content = convert_to_github_issue(
|
||||||
|
&summary, &issue_key, &issue_type, &priority,
|
||||||
|
&labels, &description, &assignee, &reporter,
|
||||||
|
&created, &status
|
||||||
|
);
|
||||||
|
|
||||||
|
// Generate a filename based on the issue key
|
||||||
|
// Sanitize issue key to create a valid filename
|
||||||
|
let safe_issue_key = sanitize_filename(issue_key);
|
||||||
|
let filename = format!("{}.md", safe_issue_key);
|
||||||
|
let filepath = batch_dir.join(&filename);
|
||||||
|
|
||||||
|
debug!("Writing issue to file: {}", filepath.display());
|
||||||
|
|
||||||
|
// Write to individual file
|
||||||
|
let mut file = File::create(&filepath)
|
||||||
|
.with_context(|| format!("Failed to create output file: {}", filepath.display()))?;
|
||||||
|
file.write_all(issue_content.as_bytes())?;
|
||||||
|
|
||||||
|
// Also append to batch file
|
||||||
|
writeln!(batch_file, "---ISSUE---\n")?;
|
||||||
|
writeln!(batch_file, "{}", issue_content)?;
|
||||||
|
|
||||||
|
stats.successful_conversions += 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Print summary
|
||||||
|
stats.print_summary();
|
||||||
|
|
||||||
|
info!("Successfully converted {} Jira issues to GitHub issue format", stats.successful_conversions);
|
||||||
|
println!("\nSuccessfully converted {} Jira issues to GitHub issue format", stats.successful_conversions);
|
||||||
|
println!("Individual issue files are in: {}", batch_dir.display());
|
||||||
|
println!("Batch file is at: {}", batch_dir.join("batch.md").display());
|
||||||
|
println!("Error log is at: {}", batch_dir.join("error_log.txt").display());
|
||||||
|
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Validate CSV headers for required fields
|
||||||
|
fn validate_headers(headers: &csv::StringRecord, required_fields: &[&str]) -> Result<()> {
|
||||||
|
let mut missing = Vec::new();
|
||||||
|
|
||||||
|
for &field in required_fields {
|
||||||
|
if !headers.iter().any(|h| h == field) {
|
||||||
|
missing.push(field);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if !missing.is_empty() {
|
||||||
|
return Err(anyhow!("Missing required columns in CSV: {}. Please ensure your Jira export includes these fields.",
|
||||||
|
missing.join(", ")));
|
||||||
|
}
|
||||||
|
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Sanitize a string to be used as a filename
|
||||||
|
fn sanitize_filename(s: &str) -> String {
|
||||||
|
s.chars()
|
||||||
|
.map(|c| if c.is_alphanumeric() || c == '-' || c == '_' || c == '.' { c } else { '_' })
|
||||||
|
.collect()
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Safely get a field by column name, handling missing columns
|
||||||
|
fn get_field_by_name(record: &csv::StringRecord, headers: &csv::StringRecord, name: &str) -> String {
|
||||||
|
for (i, header) in headers.iter().enumerate() {
|
||||||
|
if header == name && i < record.len() {
|
||||||
|
return record.get(i).unwrap_or("").to_string();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
        String::new()
}

/// Convert a Jira issue to GitHub issue markdown format
fn convert_to_github_issue(
    summary: &str,
    issue_key: &str,
    issue_type: &str,
    priority: &str,
    labels: &[String],
    description: &str,
    assignee: &str,
    reporter: &str,
    created: &str,
    default_status: &str
) -> String {
    // Determine appropriate labels based on issue type and priority
    let mut gh_labels = Vec::new();

    // Add issue type as a label
    let issue_type_label = match issue_type {
        "Bug" | "Bug⚠️" => "bug",
        "Feature" | "Feature🔺" => "enhancement",
        "Refactor" | "Refactor♻️" => "refactor",
        "Task" | "Task✔️" => "task",
        _ => "other",
    };
    gh_labels.push(issue_type_label.to_string());

    // Add priority as a label if significant
    if priority.contains("High") || priority.contains("Critical") {
        gh_labels.push("high-priority".to_string());
    }

    // Add any Jira labels
    for label in labels {
        for l in label.split(',') {
            let trimmed = l.trim();
            if !trimmed.is_empty() {
                gh_labels.push(trimmed.to_string());
            }
        }
    }

    // Format labels for YAML front matter
    let labels_yaml = if gh_labels.is_empty() {
        "".to_string()
    } else {
        let mut result = "labels:\n".to_string();
        for label in gh_labels {
            result.push_str(&format!(" - {}\n", label));
        }
        result
    };

    // Format assignees for YAML front matter
    let assignees_yaml = if assignee.is_empty() {
        "".to_string()
    } else {
        format!("assignees:\n - {}\n", assignee)
    };

    // Format the description, replacing any Jira formatting with Markdown
    let formatted_description = description
        .replace("\\n", "\n")
        .replace("{noformat}", "```")
        .replace("!image-", "![image-");

    // Create GitHub issue content with YAML front matter
    let content = format!(
        "---\ntitle: {}\nstatus: {}\n{}{}---\n\n# {}\n\n## Description\n\n{}\n\n## Metadata\n\n- **Jira Issue**: {}\n- **Created**: {}\n- **Reporter**: {}\n",
        summary,
        default_status,
        labels_yaml,
        assignees_yaml,
        summary,
        formatted_description,
        issue_key,
        created,
        reporter
    );

    content
}
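The description cleanup in `convert_to_github_issue` can be exercised in isolation. This is a standalone sketch that copies the three replacements for illustration, not the tool's code as shipped:

```rust
/// Apply the same Jira-to-Markdown replacements used in
/// `convert_to_github_issue`. Standalone copy for illustration only.
fn jira_description_to_markdown(description: &str) -> String {
    description
        .replace("\\n", "\n")            // literal \n sequences from the CSV export
        .replace("{noformat}", "```")    // Jira preformatted blocks -> fenced code
        .replace("!image-", "![image-")  // Jira image refs -> Markdown image syntax
}

fn main() {
    let jira = "Steps:\\n{noformat}cargo run{noformat}";
    let md = jira_description_to_markdown(jira);
    assert!(md.contains("```cargo run```"));
    println!("{}", md);
}
```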
23
01-gh-issue-generator/.gitignore
vendored
Normal file
@@ -0,0 +1,23 @@
# Generated by Cargo
/target/

# Remove Cargo.lock from gitignore if creating an executable, leave it for libraries
# More information here https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html
Cargo.lock

# Reports generated by the tool
/report/

# These are backup files generated by rustfmt
**/*.rs.bk

# MSVC Windows builds of rustc generate these, which store debugging information
*.pdb

# macOS files
.DS_Store

# IDE files
.idea/
.vscode/
*.iml
20
01-gh-issue-generator/Cargo.toml
Normal file
@@ -0,0 +1,20 @@
[package]
name = "gh-issue-generator"
version = "0.1.0"
edition = "2021"
authors = ["b3"]
description = "Tool to generate GitHub issues from markdown files"

[dependencies]
clap = { version = "4.4", features = ["derive"] }
serde = { version = "1.0", features = ["derive"] }
serde_yaml = "0.9"
chrono = "0.4"
walkdir = "2.4"
regex = "1.10"
thiserror = "1.0"
anyhow = "1.0"
log = "0.4"
env_logger = "0.10"
dirs = "5.0"
toml = "0.8"
425
01-gh-issue-generator/README.md
Normal file
@@ -0,0 +1,425 @@
# GitHub Issue Generator

A Rust command-line tool that generates GitHub issues from Markdown files. This tool parses structured Markdown files with YAML front matter and uses the GitHub CLI to create issues.

## Prerequisites

- [GitHub CLI](https://cli.github.com/) installed and authenticated
- [Rust](https://www.rust-lang.org/tools/install) installed (cargo, rustc)

## Installation

### Option 1: Build from source

1. Clone this repository:
```bash
git clone https://github.com/bee8333/2ticketss.git
cd 2ticketss/01-gh-issue-generator
```

2. Build the application:
```bash
cargo build --release
```

3. The executable will be in `target/release/gh-issue-generator`

4. Optional: Add to your PATH
```bash
# Linux/macOS
cp target/release/gh-issue-generator ~/.local/bin/

# or add the following to your .bashrc or .zshrc
export PATH="$PATH:/path/to/2ticketss/01-gh-issue-generator/target/release"
```

### Option 2: Install with Cargo

If you have Rust installed, you can install directly with:
```bash
cargo install --path .
```

### GitHub CLI Authentication

Before using this tool, you must authenticate the GitHub CLI:

```bash
# Login to GitHub
gh auth login

# Verify you're authenticated
gh auth status
```

Make sure you have the appropriate scopes (`repo` for private repositories) during authentication.

## Usage

```bash
gh-issue-generator [OPTIONS] <input_paths>...
```

Arguments:
- `<input_paths>...`: One or more paths to Markdown files or directories containing Markdown files

Options:
- `-r, --repo <owner/repo>`: GitHub repository in format owner/repo (e.g., "octocat/Hello-World")
- `-d, --dry-run`: Run in dry-run mode (don't actually create issues)
- `-v, --verbose`: Increase verbosity level (can be used multiple times: -v, -vv)
- `-c, --config <config>`: Path to config file

### Logging Options

You can control log verbosity in two ways:

1. Using the `--verbose` flag:
   - `-v`: Debug level logging
   - `-vv`: Trace level logging

2. Using the `RUST_LOG` environment variable:
```bash
# Set global log level
RUST_LOG=debug gh-issue-generator --repo owner/repo issues/

# Set per-module log level
RUST_LOG=gh_issue_generator=trace,walkdir=warn gh-issue-generator --repo owner/repo issues/
```

Examples:
```bash
# Process a single Markdown file
gh-issue-generator --repo myuser/myrepo issues/feature-request.md

# Process all Markdown files in a directory
gh-issue-generator --repo myuser/myrepo issues/

# Process multiple files
gh-issue-generator --repo myuser/myrepo issues/bug-1.md issues/feature-1.md

# Dry run (don't create actual issues)
gh-issue-generator --repo myuser/myrepo --dry-run issues/

# Use a specific config file
gh-issue-generator --config my-config.toml issues/

# Use increased verbosity for troubleshooting
gh-issue-generator --repo myuser/myrepo --verbose issues/
```

## Configuration

The tool supports configuration files to store default settings. Configuration is searched in these locations (in order):

1. Path specified with `--config` option
2. `.gh-issue-generator.toml` in current directory
3. `~/.gh-issue-generator.toml` in home directory
4. `~/.config/gh-issue-generator/config.toml` in XDG config directory

A default configuration file will be created at `~/.config/gh-issue-generator/config.toml` if none exists.

### Configuration Options

```toml
# Default GitHub repository (optional)
# If specified, you can omit --repo command line option
default_repo = "myuser/myrepo"

# Report output directory (default: "report")
report_dir = "report"
```

### Creating or Editing Configuration

You can manually create or edit the configuration file, or run the tool once to create a default configuration automatically.

For example, to set a default repository:

```bash
mkdir -p ~/.config/gh-issue-generator
cat > ~/.config/gh-issue-generator/config.toml << EOF
default_repo = "myuser/myrepo"
report_dir = "github-issues/reports"
EOF
```

## Complete Workflow Examples

### Example 1: Creating Feature Requests

1. Create a feature request Markdown file:

```bash
mkdir -p issues/features
cat > issues/features/search-feature.md << 'EOF'
---
title: Implement advanced search functionality
status: ready
labels:
  - enhancement
  - search
assignees:
  - developer1
milestone: v2.0
---

## Description

We need to add advanced search capabilities to improve user experience.

## Requirements

- [ ] Support fuzzy search matching
- [ ] Allow filtering by multiple criteria
- [ ] Add search history feature

## Technical Considerations

- Consider using Elasticsearch
- Should be optimized for performance
EOF
```

2. Create the GitHub issue:

```bash
gh-issue-generator --repo myuser/myrepo issues/features/search-feature.md
```

### Example 2: Converting from Jira and Batch Creating

1. Convert Jira issues to GitHub format:

```bash
cd ../00-jira-to-gh-issues
./target/release/jira-to-gh-issues --input sprint-tickets.csv --output ../issues
cd ..
```

2. Create all issues at once:

```bash
cd 01-gh-issue-generator
./target/release/gh-issue-generator --repo myuser/myrepo ../issues/20240405123045/batch.md
```

## Markdown Format

Each Markdown file should have YAML front matter at the beginning of the file, followed by the issue description.

The front matter must be enclosed between `---` lines and contain at least a `title` field.

### Front Matter Fields

| Field | Required | Description |
|-------|----------|-------------|
| `title` | Yes | Issue title |
| `status` | No | Issue status: "draft" or "ready" (default: "ready") |
| `labels` | No | Array of labels to apply to the issue |
| `assignees` | No | Array of GitHub usernames to assign the issue to |
| `milestone` | No | Milestone to add the issue to |
| `project` | No | GitHub project to add the issue to |
| `parent` | No | Issue number or URL of a parent issue |

### Issue Body

The content after the front matter becomes the issue body. It can contain any Markdown formatting, including:
- Task lists
- Code blocks
- Images
- Links
- Tables
- etc.

### Example

```markdown
---
title: Add support for OAuth2 authentication
status: ready
labels:
  - enhancement
  - security
assignees:
  - octocat
milestone: v1.0
project: Project Board
parent: 42
---

## Description

We need to add support for OAuth2 authentication to improve security.

## Tasks

- [ ] Research OAuth2 providers
- [ ] Implement OAuth2 flow
- [ ] Add tests
- [ ] Update documentation

## Additional Context

This is needed for compliance with our security requirements.
```
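The front-matter convention described above (a `---` line at the start of the file, closed by the next `---` line) can be sketched in a few lines. This is a standalone illustration under that assumption, not the tool's actual parser:

```rust
/// Split a Markdown document into (front matter, body).
/// Assumes the convention described above: front matter starts at the
/// first line of the file and is closed by the next `---` line.
fn split_front_matter(doc: &str) -> Option<(&str, &str)> {
    // Front matter must start at the very beginning of the file.
    let rest = doc.strip_prefix("---\n")?;
    // The next "---" line on its own closes the front matter.
    let end = rest.find("\n---\n")?;
    let front = &rest[..end];
    let body = &rest[end + "\n---\n".len()..];
    Some((front, body))
}

fn main() {
    let doc = "---\ntitle: Add dark mode\nstatus: ready\n---\n\n## Description\n";
    let (front, body) = split_front_matter(doc).expect("front matter present");
    assert!(front.contains("title: Add dark mode"));
    assert!(body.contains("## Description"));
    println!("front matter ok");
}
```

Files without a leading `---` yield `None`, which is one reasonable way to surface the "Failed to parse YAML front matter" error listed under Troubleshooting.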

### Draft Issues

Issues marked with `status: draft` will not be created in GitHub. They will be reported as skipped in the summary.

## Batch Files

The tool also supports processing multiple issues from a single file. This is useful for creating related issues in a single operation.

### Batch File Format

Batch files should:
1. Start with any introductory content (optional)
2. Use the delimiter `---ISSUE---` to separate individual issues
3. Include a YAML front matter block and body in each issue section

### Example Batch File

```markdown
# Sprint Planning Issues

This file contains all issues for the upcoming sprint.

---ISSUE---

---
title: Implement login feature
status: ready
labels:
  - feature
---

## Description
Add user login functionality...

---ISSUE---

---
title: Update homepage design
status: ready
labels:
  - ui
---

## Description
Refresh the homepage design...

---ISSUE---

---
title: Database migration planning
status: draft
---

## Description
Plan the database migration...
```

### Processing Batch Files

Process a batch file the same way as a regular file:

```bash
gh-issue-generator --repo myuser/myrepo sprint-planning.md
```

The tool will create each non-draft issue as a separate GitHub issue.
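The delimiter handling can be sketched as follows. This is a standalone illustration of splitting a batch document into issue sections, not the tool's shipped code:

```rust
/// Split a batch document on the `---ISSUE---` delimiter, dropping any
/// introductory content before the first delimiter. Illustration only;
/// each returned section still carries its own front matter and body.
fn split_batch(doc: &str) -> Vec<&str> {
    doc.split("---ISSUE---")
        .skip(1) // everything before the first delimiter is intro text
        .map(str::trim)
        .filter(|s| !s.is_empty())
        .collect()
}

fn main() {
    let batch = "intro\n---ISSUE---\n---\ntitle: A\n---\nbody A\n---ISSUE---\n---\ntitle: B\n---\nbody B\n";
    let issues = split_batch(batch);
    assert_eq!(issues.len(), 2);
    assert!(issues[0].contains("title: A"));
    println!("split {} issues", issues.len());
}
```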

## Report Generation

After processing all input files, the tool creates a timestamped report directory in `report/YYYY-MM-DD-HH-MM/`. The report contains:

1. A `summary.md` file with details of all processed issues
2. Copies of all processed Markdown files
   - Files for successfully created issues will have a comment with the issue URL
   - Files for failed issues will have a comment with the error message

## Parent-Child Relationship

If an issue specifies a `parent` field, the tool will add a comment to the created issue linking it to the parent issue. The parent can be specified as:
- A GitHub issue number (e.g., `42`)
- A full GitHub issue URL (e.g., `https://github.com/owner/repo/issues/42`)
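Accepting both forms amounts to trying a bare-number parse first and falling back to the last path segment of a URL. A minimal sketch of that idea (illustration only; the tool itself may parse this differently, e.g. with the `regex` crate it depends on):

```rust
/// Extract the issue number from a `parent` value, which may be a bare
/// number ("42") or a full issue URL. Standalone sketch.
fn parse_parent(parent: &str) -> Option<u64> {
    // Bare issue number
    if let Ok(n) = parent.parse::<u64>() {
        return Some(n);
    }
    // URL form: take the segment after the last '/'
    parent.rsplit('/').next()?.parse::<u64>().ok()
}

fn main() {
    assert_eq!(parse_parent("42"), Some(42));
    assert_eq!(parse_parent("https://github.com/owner/repo/issues/42"), Some(42));
    assert_eq!(parse_parent("not-a-parent"), None);
    println!("parent parsing ok");
}
```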

## Troubleshooting

### GitHub CLI Authentication Issues

If you encounter authentication issues:
```bash
# Check your GitHub CLI authentication status
gh auth status

# Re-authenticate if needed
gh auth login
```

### Permission Issues

Ensure your GitHub user has permission to create issues in the target repository.

### Rate Limiting

If you're creating many issues, you might hit GitHub's API rate limits. The tool doesn't currently implement rate limit handling, so you might need to wait before running again.

### Common Errors

| Error | Possible Solution |
|-------|------------------|
| `No such file or directory` | Check the file paths you're providing |
| `Failed to parse YAML front matter` | Ensure your YAML is properly formatted |
| `failed to fetch resource: ...` | Check your network connection and GitHub authentication |
| `Repository not found` | Verify the repository exists and you have access to it |

## Best Practices

- Use `--dry-run` to test your issue files before creating actual issues
- Organize your issue files in a logical directory structure
- Use consistent naming conventions for your issue files
- Include detailed information in your issue descriptions to minimize follow-up questions

## License

MIT

## AI-Assisted Issue Generation

You can use agentic assistants like Claude Projects or Cursor IDE to help generate issue markdown files. Here are template prompts you can use:

### Prompt for Generating a Single Issue

```
Create a GitHub issue markdown file with YAML front matter for the following project feature:

Feature description: [your feature description]
Key requirements:
- [requirement 1]
- [requirement 2]

The issue should include:
- A clear title
- Appropriate labels (choose from: bug, enhancement, documentation, feature, refactor)
- Detailed description
- Acceptance criteria
- Any technical considerations
```

### Prompt for Generating a Batch of Related Issues

```
Create a batch file containing GitHub issues for a sprint focused on [feature area].

Include 3-5 related issues that would be needed to implement this feature area.
Each issue should have appropriate front matter with title, labels, etc.
Make sure the issues are properly separated with the ---ISSUE--- delimiter.

The issues should cover different aspects like:
- Initial implementation
- UI/UX improvements
- Testing
- Documentation
```
55
01-gh-issue-generator/SAMPLE_DRAFT_ISSUE.md
Normal file
@@ -0,0 +1,55 @@
---
title: Implement payment gateway integration
status: draft
labels:
  - feature
  - payments
  - backend
assignees:
  - developer3
milestone: v1.1
project: Development Roadmap
---

# Payment Gateway Integration

## Overview

We need to integrate with a payment gateway to handle subscription and one-time payments.

## Requirements

- Support for credit card payments
- Subscription management
- Invoicing
- Webhooks for payment events
- Handling failed payments and retries

## Options to Evaluate

- [ ] Stripe
- [ ] PayPal
- [ ] Braintree
- [ ] Square

## Notes

This issue is currently in draft because we need to:
1. Finalize which payment provider to use
2. Determine pricing tiers
3. Get legal approval for terms of service

## Tasks (Preliminary)

- [ ] Research payment gateways and their fees
- [ ] Create comparison matrix of features
- [ ] Discuss with legal team about compliance requirements
- [ ] Draft initial integration architecture
- [ ] Create test accounts with potential providers

## Questions to Answer

- What are our transaction volume estimates?
- Do we need international payment support?
- What currencies do we need to support?
- Are there specific compliance requirements (PCI-DSS)?
67
01-gh-issue-generator/SAMPLE_ISSUE.md
Normal file
@@ -0,0 +1,67 @@
---
title: Implement user authentication service
status: ready
labels:
  - feature
  - security
  - backend
assignees:
  - developer1
  - developer2
milestone: v1.0
project: Development Roadmap
parent: 42
---

# User Authentication Service Implementation

## Overview

We need to implement a comprehensive user authentication service that handles registration, login, password reset, and account management.

## Requirements

- Secure password handling with bcrypt
- JWT token generation and validation
- Email verification flow
- Two-factor authentication support
- Rate limiting for login attempts
- Session management

## Tasks

- [ ] Design authentication database schema
- [ ] Implement user registration endpoint
- [ ] Add email verification flow
- [ ] Create login endpoint with JWT token generation
- [ ] Implement password reset functionality
- [ ] Add two-factor authentication
- [ ] Set up rate limiting for login attempts
- [ ] Implement session management
- [ ] Write unit and integration tests
- [ ] Create API documentation

## Technical Notes

```
POST /api/auth/register
{
  "email": "user@example.com",
  "password": "securePassword",
  "name": "User Name"
}
```

Authentication should follow OAuth 2.0 standards where applicable.

## Related Resources

- [JWT Best Practices](https://auth0.com/blog/a-look-at-the-latest-draft-for-jwt-bcp/)
- [OWASP Authentication Cheat Sheet](https://cheatsheetseries.owasp.org/cheatsheets/Authentication_Cheat_Sheet.html)

## Acceptance Criteria

1. All authentication endpoints pass security review
2. Password storage follows current best practices
3. Login and registration flows have appropriate rate limiting
4. All user flows are documented in the API docs
127
01-gh-issue-generator/examples/batch.md
Normal file
@@ -0,0 +1,127 @@
# Batch Issues Example

This file demonstrates how to use our tool with a single batch file containing multiple issues.
To use this approach, save this as a markdown file and run:

```
gh-issue-generator --repo owner/repo batch.md
```

Note: Each issue must be separated by the delimiter `---ISSUE---`

---ISSUE---

---
title: Implement user registration flow
status: ready
labels:
  - feature
  - frontend
  - backend
assignees:
  - frontend-dev
  - backend-dev
milestone: v1.0
project: User Management
---

## Description

Create a complete user registration flow including email verification.

## Requirements

- Registration form with validation
- Email verification process
- Welcome email
- Database schema updates

## Tasks

- [ ] Design registration form
- [ ] Implement frontend validation
- [ ] Create backend API endpoints
- [ ] Set up email verification system
- [ ] Update user database schema
- [ ] Add welcome email template

---ISSUE---

---
title: Add product search functionality
status: ready
labels:
  - feature
  - search
milestone: v1.0
project: Product Catalog
---

## Description

Implement search functionality for the product catalog.

## Requirements

- Search by product name, description, and tags
- Autocomplete suggestions
- Filter by category and price range
- Sort results by relevance, price, or rating

## Technical Approach

Use Elasticsearch for the search index and implement a React component for the frontend.

---ISSUE---

---
title: Optimize image loading performance
status: draft
labels:
  - performance
  - frontend
---

## Description

Optimize image loading to improve page performance and Core Web Vitals metrics.

## Ideas to Consider

- Implement lazy loading for images
- Use responsive images with srcset
- Add image compression pipeline
- Consider using a CDN for image delivery

This is currently in draft because we need to gather performance metrics first.

---ISSUE---

---
title: Update third-party dependencies
status: ready
labels:
  - maintenance
  - security
assignees:
  - devops-team
---

## Description

Update all third-party dependencies to their latest versions to address security vulnerabilities.

## Dependencies to Update

- React and related packages
- Backend frameworks
- Database drivers
- Testing libraries

## Steps

1. Review current dependencies
2. Check for security advisories
3. Create upgrade plan
4. Test upgrades in staging
5. Deploy to production
57
01-gh-issue-generator/examples/issues/bugs/login-error.md
Normal file
@@ -0,0 +1,57 @@
---
title: Fix login error with special characters in password
status: ready
labels:
  - bug
  - security
  - critical
assignees:
  - backend-dev
milestone: v1.0.1
project: Bug Fixes
parent: 123
---

# Login Error with Special Characters in Password

## Bug Description

Users are unable to log in when their password contains certain special characters (specifically `#`, `&`, and `%`). The login form submits successfully but returns a 400 error.

## Steps to Reproduce

1. Create a user account with a password containing one of the special characters: `#`, `&`, or `%`
2. Log out of the account
3. Attempt to log in with the correct credentials
4. Observe the error message and failed login

## Expected Behavior

Login should succeed with the correct credentials regardless of special characters in the password.

## Actual Behavior

When submitting the login form with credentials containing special characters, the request fails with a 400 Bad Request error. The following error appears in the console:

```
POST https://api.example.com/auth/login 400 (Bad Request)
Error: {"error":"Invalid request parameters"}
```

## Environment

- **Browser**: Chrome 98.0.4758.102, Firefox 97.0.1
- **OS**: Windows 10, macOS Monterey 12.2.1
- **Backend Version**: v1.2.3

## Technical Analysis

Initial investigation suggests that the password is not being properly URL-encoded before being sent to the backend, causing the server to reject the request.

## Possible Fix

Add proper URL encoding to the login form submission or update the backend to handle special characters in the request payload correctly.

## Severity

Critical - Affects user authentication and prevents access to the application.
51
01-gh-issue-generator/examples/issues/feature.md
Normal file
@@ -0,0 +1,51 @@
---
title: Add dark mode support
status: ready
labels:
  - enhancement
  - ui
  - accessibility
assignees:
  - ui-designer
  - frontend-dev
milestone: v2.0
project: UI Improvements
---

# Dark Mode Support

## Description

We need to add a dark mode option to our application to improve accessibility and user experience in low-light environments.

## Requirements

- Toggle switch in user settings
- System preference detection (respect user's OS preference)
- Persistent setting (remember user's choice)
- Properly themed components for all UI elements
- Appropriate contrast ratios for accessibility compliance

## Tasks

- [ ] Create dark color palette
- [ ] Add theme toggle component in settings
- [ ] Set up theme context/provider
- [ ] Implement system preference detection
- [ ] Refactor components to use theme variables
- [ ] Add theme persistence in user settings
- [ ] Test across all major browsers and devices
- [ ] Verify accessibility compliance

## Technical Approach

We should use CSS variables and a theme provider context to implement the theming system. This will allow us to switch themes without requiring a page reload.

## Resources

- [WCAG Contrast Guidelines](https://www.w3.org/WAI/WCAG21/Understanding/contrast-minimum.html)
- [Material Design Dark Theme](https://material.io/design/color/dark-theme.html)

## Implementation Notes

The main challenges will be ensuring all third-party components also respect the theme, and making sure we maintain proper contrast ratios in both themes.
17140
01-gh-issue-generator/examples/jira-import.md
Normal file
File diff suppressed because it is too large
845
01-gh-issue-generator/src/main.rs
Normal file
@@ -0,0 +1,845 @@
|
|||||||
|
use std::path::{Path, PathBuf};
|
||||||
|
use std::fs::{self, File, create_dir_all};
|
||||||
|
use std::io::{Write, Read};
|
||||||
|
use std::process::Command;
|
||||||
|
|
||||||
|
use anyhow::{Result, Context};
|
||||||
|
use chrono::Local;
|
||||||
|
use clap::Parser;
|
||||||
|
use dirs::home_dir;
|
||||||
|
use log::{info, warn, error, debug, trace, LevelFilter};
|
||||||
|
use regex::Regex;
|
||||||
|
use serde::{Deserialize, Serialize};
|
||||||
|
use walkdir::WalkDir;
|
||||||
|
|
||||||
|
/// GitHub Issue Generator from Markdown files
///
/// A tool that generates GitHub issues from structured Markdown files.
/// It parses files with YAML front matter to extract metadata like title,
/// labels, assignees, and other issue properties, then uses the GitHub CLI
/// to create issues with the specified attributes.
#[derive(Parser, Debug)]
#[command(
    author,
    version,
    about,
    long_about = "A command-line utility that creates GitHub issues from structured Markdown files.
It parses Markdown files with YAML front matter and uses the GitHub CLI to create issues.

The tool supports:
- Individual issue files with metadata like title, labels, assignees
- Batch files containing multiple issues
- Parent-child issue relationships
- GitHub project integration
- Draft issues (skipped during creation)
- Detailed reporting

Requirements:
- GitHub CLI (gh) must be installed and authenticated
- Repository must exist and user must have issue creation permissions"
)]
struct Args {
    /// GitHub repository in format owner/repo
    ///
    /// The target GitHub repository where issues will be created.
    /// Format must be 'owner/repo' (e.g., 'octocat/Hello-World').
    /// This can also be set in the configuration file.
    #[arg(short, long)]
    repo: Option<String>,

    /// Path to markdown file, directory, or multiple file paths
    ///
    /// One or more paths to:
    /// - Individual markdown files
    /// - Directories containing markdown files
    /// - Batch files with multiple issues
    ///
    /// All files must have .md or .markdown extension.
    #[arg(required = true)]
    input_paths: Vec<PathBuf>,

    /// Dry run (don't actually create issues)
    ///
    /// When enabled, the tool will parse files and show what would be created,
    /// but won't actually create any issues on GitHub. Useful for testing
    /// file formatting and validating inputs before actual creation.
    #[arg(short, long)]
    dry_run: bool,

    /// Verbosity level (can be used multiple times)
    ///
    /// Controls the amount of information displayed during execution:
    /// -v: Debug level (detailed information for troubleshooting)
    /// -vv: Trace level (very verbose, all operations logged)
    ///
    /// You can also use the RUST_LOG environment variable instead.
    #[arg(short, long, action = clap::ArgAction::Count)]
    verbose: u8,

    /// Path to config file
    ///
    /// Path to a TOML configuration file. If not specified, the tool will
    /// search for configuration in standard locations:
    /// - .gh-issue-generator.toml in current directory
    /// - ~/.gh-issue-generator.toml in home directory
    /// - ~/.config/gh-issue-generator/config.toml
    #[arg(short = 'c', long)]
    config: Option<PathBuf>,
}
#[derive(Debug, Serialize, Deserialize)]
struct Config {
    /// Default GitHub repository in format owner/repo
    default_repo: Option<String>,

    /// Default report output directory
    #[serde(default = "default_report_dir")]
    report_dir: String,
}

fn default_report_dir() -> String {
    "report".to_string()
}

impl Default for Config {
    fn default() -> Self {
        Self {
            default_repo: None,
            report_dir: default_report_dir(),
        }
    }
}
impl Config {
    /// Load configuration from a file, falling back to defaults if none is found
    fn load(config_path: Option<&PathBuf>) -> Result<Self> {
        // If a path is provided, try to load from there
        if let Some(path) = config_path {
            debug!("Loading config from specified path: {}", path.display());
            return Self::load_from_file(path);
        }

        // Try to load from default locations
        let config_paths = [
            // Current directory
            PathBuf::from(".gh-issue-generator.toml"),
            // Home directory
            home_dir().map(|h| h.join(".gh-issue-generator.toml")).unwrap_or_default(),
            // XDG config directory
            home_dir().map(|h| h.join(".config/gh-issue-generator/config.toml")).unwrap_or_default(),
        ];

        for path in &config_paths {
            if path.exists() {
                debug!("Found config file at: {}", path.display());
                return Self::load_from_file(path);
            }
        }

        // No config file found, fall back to defaults
        debug!("No config file found, using defaults");
        Ok(Self::default())
    }

    /// Load configuration from a specific file
    fn load_from_file(path: &Path) -> Result<Self> {
        let mut file = File::open(path)
            .context(format!("Failed to open config file: {}", path.display()))?;

        let mut content = String::new();
        file.read_to_string(&mut content)
            .context("Failed to read config file")?;

        toml::from_str(&content)
            .context("Failed to parse config file as TOML")
    }

    /// Save configuration to file
    fn save(&self, path: &Path) -> Result<()> {
        debug!("Saving config to: {}", path.display());

        // Create parent directories if they don't exist
        if let Some(parent) = path.parent() {
            create_dir_all(parent)?;
        }

        let content = toml::to_string_pretty(self)
            .context("Failed to serialize config to TOML")?;

        let mut file = File::create(path)
            .context(format!("Failed to create config file: {}", path.display()))?;

        file.write_all(content.as_bytes())
            .context("Failed to write config file")?;

        Ok(())
    }

    /// Create a default config file if it doesn't exist
    fn create_default_if_not_exists() -> Result<()> {
        let default_path = home_dir()
            .map(|h| h.join(".config/gh-issue-generator/config.toml"))
            .ok_or_else(|| anyhow::anyhow!("Failed to determine home directory"))?;

        if !default_path.exists() {
            debug!("Creating default config file at: {}", default_path.display());
            let default_config = Self::default();
            default_config.save(&default_path)?;
            info!("Created default config file at: {}", default_path.display());
            println!("Created default config file at: {}", default_path.display());
        }

        Ok(())
    }
}
/// Issue front matter metadata
#[derive(Debug, Serialize, Deserialize, Clone)]
struct IssueMeta {
    /// Issue title
    title: String,

    /// Status (draft, ready)
    #[serde(default = "default_status")]
    status: String,

    /// GitHub project (number or URL)
    #[serde(default)]
    project: Option<String>,

    /// Issue labels
    #[serde(default)]
    labels: Vec<String>,

    /// Issue assignees
    #[serde(default)]
    assignees: Vec<String>,

    /// Parent issue number or URL
    #[serde(default)]
    parent: Option<String>,

    /// Issue milestone
    #[serde(default)]
    milestone: Option<String>,
}

fn default_status() -> String {
    "ready".to_string()
}

/// Result of issue creation
#[derive(Debug)]
struct IssueResult {
    filepath: PathBuf,
    filename: String,
    meta: IssueMeta,
    success: bool,
    issue_url: Option<String>,
    error: Option<String>,
}

/// Validate repository format (owner/repo)
fn validate_repository_format(repo: &str) -> Result<()> {
    debug!("Validating repository format: {}", repo);

    // Check if the repository follows the owner/repo format
    let repo_regex = Regex::new(r"^[a-zA-Z0-9_.-]+/[a-zA-Z0-9_.-]+$")?;

    if !repo_regex.is_match(repo) {
        let error_msg = format!("Invalid repository format: '{}'. Expected format: 'owner/repo'", repo);
        error!("{}", error_msg);
        return Err(anyhow::anyhow!(error_msg));
    }

    debug!("Repository format is valid");
    Ok(())
}
fn main() -> Result<()> {
    let args = Args::parse();

    // Initialize logger with appropriate level
    let log_level = match args.verbose {
        0 => LevelFilter::Info,
        1 => LevelFilter::Debug,
        _ => LevelFilter::Trace,
    };

    // Initialize the logger, respecting the RUST_LOG env var if set
    let mut builder = env_logger::Builder::new();

    // If RUST_LOG is set, use that config; otherwise use our default based on -v flags
    match std::env::var("RUST_LOG") {
        Ok(_) => {
            // RUST_LOG is set, initialize with default env_logger behavior
            builder.init();
        },
        Err(_) => {
            // RUST_LOG not set, use our verbosity flag
            builder
                .filter_level(log_level)
                .format_timestamp(Some(env_logger::fmt::TimestampPrecision::Seconds))
                .init();

            // Log a hint about RUST_LOG for users who use -v
            if args.verbose > 0 {
                debug!("Tip: You can also control log level with the RUST_LOG environment variable");
            }
        }
    };

    info!("Starting GitHub Issue Generator");

    // Load configuration
    let config = Config::load(args.config.as_ref())?;
    debug!("Loaded configuration: {:?}", config);

    // Create default config if it doesn't exist (only in normal operation)
    if args.verbose == 0 {
        if let Err(e) = Config::create_default_if_not_exists() {
            warn!("Failed to create default config file: {}", e);
        }
    }

    // Determine the repository to use (command line takes precedence over config)
    let repo = match (args.repo, &config.default_repo) {
        (Some(r), _) => r,
        (None, Some(r)) => {
            info!("Using repository from config: {}", r);
            r.clone()
        },
        (None, None) => {
            let error_msg = "No repository specified. Use --repo option or set default_repo in config file.";
            error!("{}", error_msg);
            return Err(anyhow::anyhow!(error_msg));
        }
    };

    debug!("Using repository: {}", repo);
    debug!("Dry run: {}", args.dry_run);
    debug!("Input paths: {:?}", args.input_paths);

    // Validate repository format
    validate_repository_format(&repo)?;

    // Check if gh CLI is installed
    check_gh_cli()?;

    // Get list of all files to process
    let files = collect_markdown_files(&args.input_paths)?;
    info!("Found {} markdown files to process", files.len());

    // Process each file
    let mut results = Vec::new();
    for file in files {
        match process_file(&file, &repo, args.dry_run) {
            Ok(mut file_results) => results.append(&mut file_results),
            Err(e) => {
                error!("Error processing file {}: {}", file.display(), e);
                eprintln!("Error processing file {}: {}", file.display(), e);
            }
        }
    }

    // Create report directory
    let timestamp = Local::now().format("%Y-%m-%d-%H-%M").to_string();
    let report_dir = PathBuf::from(format!("{}/{}", config.report_dir, timestamp));
    create_dir_all(&report_dir)?;
    debug!("Created report directory: {}", report_dir.display());

    // Write report
    write_report(&report_dir, &results)?;

    // Print summary
    let success_count = results.iter().filter(|r| r.success).count();
    info!("Summary: Successfully created {}/{} issues", success_count, results.len());
    println!("\nSummary: Successfully created {}/{} issues", success_count, results.len());
    println!("Report saved to {}", report_dir.display());

    Ok(())
}
/// Check if GitHub CLI is installed and authenticated
fn check_gh_cli() -> Result<()> {
    debug!("Checking if GitHub CLI is installed and authenticated");

    let output = Command::new("gh")
        .arg("--version")
        .output()
        .context("Failed to execute 'gh'. Is GitHub CLI installed?")?;

    if !output.status.success() {
        let error_msg = "GitHub CLI check failed. Please install the GitHub CLI ('gh') and authenticate.";
        error!("{}", error_msg);
        anyhow::bail!(error_msg);
    }

    debug!("GitHub CLI is available");
    Ok(())
}
/// Collect all markdown files from the provided paths
fn collect_markdown_files(paths: &[PathBuf]) -> Result<Vec<PathBuf>> {
    debug!("Collecting markdown files from input paths");
    let mut files = Vec::new();

    for path in paths {
        if path.is_file() && is_markdown_file(path) {
            debug!("Adding file: {}", path.display());
            files.push(path.clone());
        } else if path.is_dir() {
            debug!("Scanning directory: {}", path.display());
            for entry in WalkDir::new(path).into_iter().filter_map(|e| e.ok()) {
                let entry_path = entry.path();
                if entry_path.is_file() && is_markdown_file(entry_path) {
                    trace!("Found markdown file: {}", entry_path.display());
                    files.push(entry_path.to_path_buf());
                }
            }
        }
    }

    Ok(files)
}
/// Check if file is a markdown file
fn is_markdown_file(path: &Path) -> bool {
    path.extension()
        .map(|ext| ext.eq_ignore_ascii_case("md") || ext.eq_ignore_ascii_case("markdown"))
        .unwrap_or(false)
}
/// Process a single markdown file, which may contain multiple issues
fn process_file(filepath: &Path, repo: &str, dry_run: bool) -> Result<Vec<IssueResult>> {
    info!("Processing file: {}", filepath.display());

    // Read file content
    let content = fs::read_to_string(filepath)?;
    debug!("File read successfully, content length: {} bytes", content.len());

    // Check if this is a batch file with multiple issues
    if content.contains("---ISSUE---") {
        process_batch_file(filepath, &content, repo, dry_run)
    } else {
        // Process as a single issue file
        debug!("Processing as single issue file");
        match process_single_issue(filepath, &content, repo, dry_run) {
            Ok(result) => Ok(vec![result]),
            Err(e) => {
                error!("Failed to process single issue file: {}", e);
                Err(e)
            }
        }
    }
}
/// Process a batch file containing multiple issues
fn process_batch_file(filepath: &Path, content: &str, repo: &str, dry_run: bool) -> Result<Vec<IssueResult>> {
    info!("Processing as batch file with multiple issues");

    // Split content by issue delimiter
    let filename = filepath.file_name().unwrap().to_string_lossy().to_string();
    let issues: Vec<&str> = content.split("---ISSUE---").skip(1).collect();
    info!("Found {} issues in batch file", issues.len());

    let mut results = Vec::new();

    // Process each issue
    for (index, issue_content) in issues.iter().enumerate() {
        let issue_filename = format!("{}-issue-{}", filename, index + 1);
        info!("Processing issue {}/{}: {}", index + 1, issues.len(), issue_filename);

        // Parse issue content
        match parse_markdown_with_frontmatter(issue_content) {
            Ok((meta, body)) => {
                // Skip draft issues
                if meta.status.to_lowercase() == "draft" {
                    info!("Skipping draft issue: {}", meta.title);
                    println!(" Skipping draft issue: {}", meta.title);
                    results.push(IssueResult {
                        filepath: filepath.to_path_buf(),
                        filename: issue_filename,
                        meta,
                        success: false,
                        issue_url: None,
                        error: Some("Issue marked as draft".to_string()),
                    });
                    continue;
                }

                // Create GitHub issue
                let result = if dry_run {
                    info!("[DRY RUN] Would create issue: {}", meta.title);
                    println!(" [DRY RUN] Would create issue: {}", meta.title);
                    IssueResult {
                        filepath: filepath.to_path_buf(),
                        filename: issue_filename,
                        meta,
                        success: true,
                        issue_url: Some("https://github.com/dry-run/issue/1".to_string()),
                        error: None,
                    }
                } else {
                    create_github_issue(filepath, repo, &meta, &body, issue_filename)?
                };

                results.push(result);
            },
            Err(e) => {
                error!("Error parsing issue {}: {}", index + 1, e);
                eprintln!(" Error parsing issue {}: {}", index + 1, e);
                results.push(IssueResult {
                    filepath: filepath.to_path_buf(),
                    filename: issue_filename,
                    meta: IssueMeta {
                        title: format!("Failed to parse issue #{}", index + 1),
                        status: "error".to_string(),
                        project: None,
                        labels: Vec::new(),
                        assignees: Vec::new(),
                        parent: None,
                        milestone: None,
                    },
                    success: false,
                    issue_url: None,
                    error: Some(format!("Failed to parse issue: {}", e)),
                });
            }
        }
    }

    Ok(results)
}
/// Process a single issue from a markdown file
fn process_single_issue(filepath: &Path, content: &str, repo: &str, dry_run: bool) -> Result<IssueResult> {
    // Parse front matter and content
    let (meta, body) = parse_markdown_with_frontmatter(content)?;
    let filename = filepath.file_name().unwrap().to_string_lossy().to_string();

    // Skip draft issues
    if meta.status.to_lowercase() == "draft" {
        println!(" Skipping draft issue: {}", meta.title);
        return Ok(IssueResult {
            filepath: filepath.to_path_buf(),
            filename,
            meta,
            success: false,
            issue_url: None,
            error: Some("Issue marked as draft".to_string()),
        });
    }

    // Create GitHub issue
    let result = if dry_run {
        println!(" [DRY RUN] Would create issue: {}", meta.title);
        IssueResult {
            filepath: filepath.to_path_buf(),
            filename,
            meta,
            success: true,
            issue_url: Some("https://github.com/dry-run/issue/1".to_string()),
            error: None,
        }
    } else {
        create_github_issue(filepath, repo, &meta, &body, filename)?
    };

    Ok(result)
}
/// Parse markdown file with YAML front matter
fn parse_markdown_with_frontmatter(content: &str) -> Result<(IssueMeta, String)> {
    // Front matter should be at the start of the file between --- delimiters
    // Use a more lenient regex that allows whitespace before the start
    let front_matter_regex = Regex::new(r"(?s)\s*^---\s*\n(.*?)\n---\s*\n(.*)")?;

    if let Some(captures) = front_matter_regex.captures(content.trim()) {
        let front_matter = captures.get(1).unwrap().as_str();
        let body = captures.get(2).unwrap().as_str().trim().to_string();

        // Parse YAML front matter
        let meta: IssueMeta = serde_yaml::from_str(front_matter)
            .context("Failed to parse YAML front matter")?;

        Ok((meta, body))
    } else {
        anyhow::bail!("No valid front matter found in markdown file")
    }
}
/// Create GitHub issue using gh CLI
fn create_github_issue(filepath: &Path, repo: &str, meta: &IssueMeta, body: &str, filename: String) -> Result<IssueResult> {
    // Build gh issue create command
    let mut cmd = Command::new("gh");
    cmd.arg("issue")
        .arg("create")
        .arg("--repo")
        .arg(repo)
        .arg("--title")
        .arg(&meta.title)
        .arg("--body")
        .arg(body);

    // Add labels if present
    if !meta.labels.is_empty() {
        cmd.arg("--label");
        let labels = meta.labels.join(",");
        cmd.arg(labels);
    }

    // Add assignees if present
    if !meta.assignees.is_empty() {
        cmd.arg("--assignee");
        let assignees = meta.assignees.join(",");
        cmd.arg(assignees);
    }

    // Add milestone if present
    if let Some(milestone) = &meta.milestone {
        cmd.arg("--milestone");
        cmd.arg(milestone);
    }

    // Add project if present
    if let Some(project) = &meta.project {
        cmd.arg("--project");
        cmd.arg(project);
    }

    // Execute command
    println!(" Creating issue: {}", meta.title);
    let output = cmd.output()?;

    // Process result
    if output.status.success() {
        let issue_url = String::from_utf8(output.stdout)?.trim().to_string();
        println!(" Created issue: {}", issue_url);

        // Link to parent if specified
        if let Some(parent) = &meta.parent {
            link_to_parent_issue(repo, &issue_url, parent)?;
        }

        Ok(IssueResult {
            filepath: filepath.to_path_buf(),
            filename,
            meta: meta.clone(),
            success: true,
            issue_url: Some(issue_url),
            error: None,
        })
    } else {
        let error = String::from_utf8(output.stderr)?.trim().to_string();
        eprintln!(" Failed to create issue: {}", error);

        Ok(IssueResult {
            filepath: filepath.to_path_buf(),
            filename,
            meta: meta.clone(),
            success: false,
            issue_url: None,
            error: Some(error),
        })
    }
}
/// Link issue to parent issue using gh cli
fn link_to_parent_issue(repo: &str, issue_url: &str, parent: &str) -> Result<()> {
    // Extract issue number from URL
    let issue_number = extract_issue_number(issue_url)?;
    let parent_issue = if parent.contains('/') {
        parent.to_string()
    } else {
        format!("#{}", parent)
    };

    // Add comment linking to parent
    let comment = format!("Child of {}", parent_issue);

    let output = Command::new("gh")
        .arg("issue")
        .arg("comment")
        .arg("--repo")
        .arg(repo)
        .arg(issue_number)
        .arg("--body")
        .arg(comment)
        .output()?;

    if !output.status.success() {
        eprintln!(" Failed to link to parent issue: {}", String::from_utf8(output.stderr)?);
    }

    Ok(())
}
/// Extract issue number from GitHub issue URL
fn extract_issue_number(url: &str) -> Result<String> {
    let re = Regex::new(r"/issues/(\d+)$")?;
    if let Some(captures) = re.captures(url) {
        Ok(captures.get(1).unwrap().as_str().to_string())
    } else {
        Ok(url.trim_start_matches('#').to_string())
    }
}
/// Write report of processed files
fn write_report(report_dir: &Path, results: &[IssueResult]) -> Result<()> {
    // Create summary file
    let mut summary = File::create(report_dir.join("summary.md"))?;
    writeln!(summary, "# GitHub Issue Creation Report")?;
    writeln!(summary, "\nGenerated on: {}\n", Local::now().format("%Y-%m-%d %H:%M:%S"))?;
    writeln!(summary, "## Summary")?;
    writeln!(summary, "- Total issues processed: {}", results.len())?;
    writeln!(summary, "- Successfully created: {}", results.iter().filter(|r| r.success).count())?;
    writeln!(summary, "- Failed: {}", results.iter().filter(|r| !r.success).count())?;

    writeln!(summary, "\n## Details")?;
    for result in results {
        let status = if result.success { "✅" } else { "❌" };
        writeln!(summary, "### {} {}", status, result.meta.title)?;
        writeln!(summary, "- File: {} ({})", result.filepath.display(), result.filename)?;

        if let Some(url) = &result.issue_url {
            writeln!(summary, "- Issue: {}", url)?;
        }

        if let Some(error) = &result.error {
            writeln!(summary, "- Error: {}", error)?;
        }

        writeln!(summary)?;
    }

    // Copy all processed files with results
    write_individual_reports(report_dir, results)?;

    // Copy batch files with their content
    write_batch_reports(report_dir, results)?;

    Ok(())
}
/// Write individual report files for each processed issue
fn write_individual_reports(report_dir: &Path, results: &[IssueResult]) -> Result<()> {
    // Group results by filepath
    let mut filepath_map: std::collections::HashMap<String, Vec<&IssueResult>> = std::collections::HashMap::new();

    for result in results {
        let path_str = result.filepath.to_string_lossy().to_string();
        filepath_map.entry(path_str).or_default().push(result);
    }

    // Process each filepath
    for (path_str, file_results) in filepath_map {
        let _path = Path::new(&path_str);

        // Skip batch files - they'll be handled separately
        if file_results.len() > 1 && file_results[0].filepath == file_results[1].filepath {
            continue;
        }

        // Single issue file
        if let Some(result) = file_results.first() {
            let dest_path = report_dir.join(&result.filename);

            // Read original content
            let mut content = fs::read_to_string(&result.filepath)?;

            // Add result information
            if !result.success {
                content = format!("---\n# FAILED: {}\n---\n\n{}",
                    result.error.as_deref().unwrap_or("Unknown error"),
                    content);
            } else if let Some(url) = &result.issue_url {
                content = format!("---\n# CREATED: {}\n---\n\n{}", url, content);
            }

            // Write to report directory
            fs::write(dest_path, content)?;
        }
    }

    Ok(())
}
/// Write batch reports for files containing multiple issues
fn write_batch_reports(report_dir: &Path, results: &[IssueResult]) -> Result<()> {
    // Group results by filepath
    let mut filepath_map: std::collections::HashMap<String, Vec<&IssueResult>> = std::collections::HashMap::new();

    for result in results {
        let path_str = result.filepath.to_string_lossy().to_string();
        filepath_map.entry(path_str).or_default().push(result);
    }

    // Process each filepath with multiple results (batch files)
    for (path_str, file_results) in filepath_map {
        let path = Path::new(&path_str);

        // Only process batch files (files with multiple issues)
        if file_results.len() > 1 && file_results[0].filepath == file_results[1].filepath {
            let filename = path.file_name().unwrap().to_string_lossy().to_string();
            let dest_path = report_dir.join(filename);

            // Read original content
            let content = fs::read_to_string(path)?;

            // Split by issue delimiter
            let mut parts: Vec<&str> = content.split("---ISSUE---").collect();

            // First part is the file header/intro
            let header = parts.remove(0);

            // Create new content with results
            let mut new_content = header.to_string();

            for (part, result) in parts.iter().zip(file_results.iter()) {
                let issue_marker = "\n---ISSUE---\n";
                let status_comment = if !result.success {
                    format!("# FAILED: {}\n", result.error.as_deref().unwrap_or("Unknown error"))
                } else if let Some(url) = &result.issue_url {
                    format!("# CREATED: {}\n", url)
                } else {
                    String::new()
                };

                new_content.push_str(issue_marker);
                new_content.push_str(&status_comment);
                new_content.push_str(part);
            }

            // Write to report directory
            fs::write(dest_path, new_content)?;

            // Also create individual files for each issue in the batch
            for result in &file_results {
                // Parse front matter and content from the original file
                let original_content = fs::read_to_string(&result.filepath)?;
                let parts: Vec<&str> = original_content.split("---ISSUE---").skip(1).collect();

                if let Some(part) = parts.get(file_results.iter().position(|r| r.filename == result.filename).unwrap_or(0)) {
                    let dest_path = report_dir.join(&result.filename);

                    // Add result information
                    let content = if !result.success {
                        format!("---\n# FAILED: {}\n---\n\n{}",
                            result.error.as_deref().unwrap_or("Unknown error"),
                            part)
                    } else if let Some(url) = &result.issue_url {
                        format!("---\n# CREATED: {}\n---\n\n{}", url, part)
                    } else {
                        part.to_string()
                    };

                    // Write to report directory
                    fs::write(dest_path, content)?;
                }
            }
        }
    }

    Ok(())
}
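The front-matter extraction that `parse_markdown_with_frontmatter` performs with the `regex` and `serde_yaml` crates can be sketched with `std` alone. This is a simplified illustration of the delimiter logic, not the tool's exact implementation, and it ignores edge cases such as a literal `---` line inside the body:

```rust
/// Split a markdown document into (front matter, body).
/// Returns None when no `---`-fenced front matter is present.
fn split_front_matter(content: &str) -> Option<(String, String)> {
    // Front matter must start at the (trimmed) beginning of the file.
    let rest = content.trim_start().strip_prefix("---")?;
    // It ends at the next line that begins with `---`.
    let end = rest.find("\n---")?;
    let front = rest[..end].trim().to_string();
    // Skip past the closing `\n---` delimiter to reach the body.
    let body = rest[end + 4..].trim().to_string();
    Some((front, body))
}

fn main() {
    let doc = "---\ntitle: Example issue\nlabels: [bug]\n---\n\nIssue body here.";
    let (front, body) = split_front_matter(doc).unwrap();
    assert!(front.contains("title: Example issue"));
    assert_eq!(body, "Issue body here.");
    println!("front matter parsed");
}
```

The real tool then hands the front-matter string to `serde_yaml::from_str` to produce an `IssueMeta`; a file without the opening delimiter is rejected, matching the `bail!` branch above.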
6
Cargo.toml
Normal file
@@ -0,0 +1,6 @@
[workspace]
members = [
    "00-jira-to-gh-issues",
    "01-gh-issue-generator",
]
resolver = "2"
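The generator's optional configuration file is plain TOML matching the `Config` struct in `main.rs` (`default_repo`, `report_dir`). A minimal sketch, with a placeholder repository value:

```toml
# ~/.config/gh-issue-generator/config.toml
default_repo = "octocat/Hello-World"
report_dir = "report"
```

Both keys are optional: `default_repo` is overridden by `--repo` on the command line, and `report_dir` defaults to `report` when omitted.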
@@ -1,2 +1,7 @@
# 2ticketss

This repository contains two complementary command-line tools for GitHub issue management:

- **00-jira-to-gh-issues**: A Rust tool that converts Jira CSV exports to GitHub issue markdown files compatible with gh-issue-generator. It handles messy CSV data and preserves issue metadata.
- **01-gh-issue-generator**: A Rust tool that creates GitHub issues from Markdown files with YAML front matter. It parses structured Markdown, supports batch processing, and integrates with GitHub CLI.
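An issue file consumed by 01-gh-issue-generator looks like the following. This is a sketch: the field names come from the `IssueMeta` front-matter struct in `main.rs` (`title`, `status`, `labels`, `assignees`, `milestone`, `parent`, `project`), but the concrete values are illustrative:

```markdown
---
title: Add dark mode support
status: ready
labels: [enhancement, ui]
assignees: [octocat]
milestone: "v1.0"
---

## Description

Implement a dark theme toggle for the application.
```

Batch files concatenate several such blocks separated by `---ISSUE---` lines; issues whose `status` is `draft` are parsed but skipped during creation.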