SHA-224 Go SDK
Our Go SDK provides a high-performance implementation of the SHA-224 cryptographic hash function, designed with Go's concurrency model in mind.
Standard Library Integration
Seamlessly integrates with Go's crypto package interfaces.
Concurrent Processing
Optimized for concurrent hashing operations using goroutines.
Stream Processing
Efficiently process large files and io.Readers with low memory usage.
Zero Dependencies
Pure Go implementation with no external dependencies.
Production Ready
Comprehensive testing, benchmarks, and examples for production use.
REST API Integration
Built-in client for the SHA224.com REST API.
Installation
Install the SHA-224 Go SDK using go get:
go get github.com/sha224-org/sha224-go
The SDK requires Go 1.13 or higher.
Importing the Package
import (
"github.com/sha224-org/sha224-go"
)
// For the API client
import (
"github.com/sha224-org/sha224-go/api"
)
Verification
To verify the installation, run the following simple test:
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
)
func main() {
hash := sha224.Sum([]byte("Hello, world!"))
fmt.Printf("SHA-224 hash: %x\n", hash)
}
Quick Start
Generate SHA-224 hashes with just a few lines of code:
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"os"
)
func main() {
// Hash a string
hash := sha224.Sum([]byte("Hello, world!"))
fmt.Printf("Hash: %x\n", hash) // 8552d8b7a7dc5476cb9e25dee69a8091290764b7f2a64fe6e78e9568
// Hash a string (convenience method)
strHash := sha224.SumString("Hello, world!")
fmt.Printf("String hash: %x\n", strHash)
// Incremental hashing
h := sha224.New()
h.Write([]byte("Hello, "))
h.Write([]byte("world!"))
incrementalHash := h.Sum(nil)
fmt.Printf("Incremental hash: %x\n", incrementalHash) // Same as above
// Hash a file
file, err := os.Open("document.pdf")
if err != nil {
fmt.Println("Error opening file:", err)
return
}
defer file.Close()
fileHash, err := sha224.SumReader(file)
if err != nil {
fmt.Println("Error hashing file:", err)
return
}
fmt.Printf("File hash: %x\n", fileHash)
// Verify a hash
expected := "8552d8b7a7dc5476cb9e25dee69a8091290764b7f2a64fe6e78e9568"
isValid := sha224.Verify(expected, []byte("Hello, world!"))
fmt.Println("Hash is valid:", isValid) // true
}
Using the API Client
package main
import (
"fmt"
"github.com/sha224-org/sha224-go/api"
)
func main() {
// Initialize the API client
client := api.NewClient("YOUR_API_KEY")
// Hash using the API
apiHash, err := client.HashText("Hello, world!")
if err != nil {
fmt.Println("Error:", err)
return
}
fmt.Println("API hash:", apiHash)
// Verify a hash using the API
expected := "8552d8b7a7dc5476cb9e25dee69a8091290764b7f2a64fe6e78e9568"
isValid, err := client.Verify(expected, "Hello, world!")
if err != nil {
fmt.Println("Error:", err)
return
}
fmt.Println("API verification:", isValid) // true
}
API Reference
sha224.Sum(data []byte) [28]byte
Computes a SHA-224 hash of the input data and returns it as a fixed-size array.
Parameters
data []byte
- The data to hash
Returns
[28]byte
- The SHA-224 hash of the input as a fixed-size array
Example
// Hash a byte slice
data := []byte("Hello, world!")
hash := sha224.Sum(data)
fmt.Printf("%x\n", hash)
// Convert to hex string
hexHash := fmt.Sprintf("%x", hash)
sha224.SumString(s string) [28]byte
Convenience function to compute a SHA-224 hash of a string.
Parameters
s string
- The string to hash
Returns
[28]byte
- The SHA-224 hash of the input string as a fixed-size array
Example
// Hash a string directly
hash := sha224.SumString("Hello, world!")
fmt.Printf("%x\n", hash)
sha224.SumReader(r io.Reader) ([28]byte, error)
Computes a SHA-224 hash of the data read from an io.Reader.
Parameters
r io.Reader
- The reader to read data from
Returns
[28]byte
- The SHA-224 hash of the read data as a fixed-size array
error
- Any error encountered during reading
Example
// Hash a file
file, err := os.Open("document.pdf")
if err != nil {
log.Fatal(err)
}
defer file.Close()
hash, err := sha224.SumReader(file)
if err != nil {
log.Fatal(err)
}
fmt.Printf("File hash: %x\n", hash)
sha224.SumReaderWithProgress(r io.Reader, size int64, progress func(int64, int64)) ([28]byte, error)
Computes a SHA-224 hash of the data read from an io.Reader while reporting progress.
Parameters
r io.Reader
- The reader to read data from
size int64
- The total size of the data (or -1 if unknown)
progress func(int64, int64)
- Callback function that receives the current bytes read and the total size
Returns
[28]byte
- The SHA-224 hash of the read data as a fixed-size array
error
- Any error encountered during reading
Example
// Hash a file with progress reporting
file, err := os.Open("large-file.bin")
if err != nil {
log.Fatal(err)
}
defer file.Close()
fileInfo, err := file.Stat()
if err != nil {
log.Fatal(err)
}
hash, err := sha224.SumReaderWithProgress(file, fileInfo.Size(), func(bytesRead, totalSize int64) {
progress := float64(bytesRead) / float64(totalSize) * 100
fmt.Printf("Progress: %.2f%%\n", progress)
})
if err != nil {
log.Fatal(err)
}
fmt.Printf("File hash: %x\n", hash)
sha224.SumFile(path string) ([28]byte, error)
Computes a SHA-224 hash of a file specified by its path.
Parameters
path string
- The path to the file to hash
Returns
[28]byte
- The SHA-224 hash of the file as a fixed-size array
error
- Any error encountered during file operations
Example
// Hash a file by path
hash, err := sha224.SumFile("document.pdf")
if err != nil {
log.Fatal(err)
}
fmt.Printf("File hash: %x\n", hash)
sha224.New() hash.Hash
Creates a new hash.Hash computing the SHA-224 checksum.
Returns
hash.Hash
- A hash.Hash instance for SHA-224
Example
// Create a hash instance
h := sha224.New()
// Write data incrementally
h.Write([]byte("Part 1"))
h.Write([]byte("Part 2"))
// Get the final hash
sum := h.Sum(nil)
fmt.Printf("%x\n", sum)
// Reset and reuse
h.Reset()
h.Write([]byte("New data"))
newSum := h.Sum(nil)
sha224.Verify(expected string, data []byte) bool
Verifies if the provided data matches the expected hash (hex string).
Parameters
expected string
- The expected SHA-224 hash as a hex string
data []byte
- The data to verify
Returns
bool
- True if the hash matches, false otherwise
Example
// Verify a hash
expected := "8552d8b7a7dc5476cb9e25dee69a8091290764b7f2a64fe6e78e9568"
isValid := sha224.Verify(expected, []byte("Hello, world!"))
fmt.Println("Valid:", isValid) // true
sha224.VerifyFile(expected string, path string) (bool, error)
Verifies if a file matches the expected hash (hex string).
Parameters
expected string
- The expected SHA-224 hash as a hex string
path string
- The path to the file to verify
Returns
bool
- True if the hash matches, false otherwise
error
- Any error encountered during file operations
Example
// Verify a file
expected := "8552d8b7a7dc5476cb9e25dee69a8091290764b7f2a64fe6e78e9568"
isValid, err := sha224.VerifyFile(expected, "document.pdf")
if err != nil {
log.Fatal(err)
}
fmt.Println("File is valid:", isValid)
sha224/api.NewClient(apiKey string) *Client
Creates a new client for the SHA224.com REST API.
Parameters
apiKey string
- Your SHA224.com API key
Returns
*Client
- A client instance for interacting with the API
Example
// Create a new API client
client := api.NewClient("YOUR_API_KEY")
// Configure with options
client.SetBaseURL("https://api.sha224.com/v1")
client.SetTimeout(5 * time.Second)
Client Methods
The API client provides methods for interacting with the SHA224.com API.
Methods
HashText(text string) (string, error)
- Hash text using the API
HashTextWithEncoding(text, encoding string) (string, error)
- Hash text with a specified encoding
HashFile(path string) (string, error)
- Hash a file using the API
HashFileWithProgress(path string, progress func(int64, int64)) (string, error)
- Hash a file with progress tracking
Verify(expected, text string) (bool, error)
- Verify a hash using the API
VerifyFile(expected, path string) (bool, error)
- Verify a file hash
BatchHash(items []BatchItem) ([]BatchResult, error)
- Batch hash multiple items
Example
// Use the API client
client := api.NewClient("YOUR_API_KEY")
// Hash text
hash, err := client.HashText("Hello, world!")
if err != nil {
log.Fatal(err)
}
fmt.Println("Hash:", hash)
// Verify a hash
isValid, err := client.Verify(hash, "Hello, world!")
if err != nil {
log.Fatal(err)
}
fmt.Println("Valid:", isValid)
// Batch operations
items := []api.BatchItem{
{ID: "item1", Text: "First item"},
{ID: "item2", Text: "Second item"},
{ID: "item3", Text: "Third item"},
}
results, err := client.BatchHash(items)
if err != nil {
log.Fatal(err)
}
for _, result := range results {
fmt.Printf("%s: %s\n", result.ID, result.Hash)
}
Examples
Basic Usage Examples
Hashing Different Data Types
package main
import (
"encoding/base64"
"encoding/hex"
"fmt"
"github.com/sha224-org/sha224-go"
)
func main() {
// Hash a byte slice
data := []byte("Hello, world!")
hash := sha224.Sum(data)
fmt.Printf("Byte slice hash: %x\n", hash)
// Hash a string directly
strHash := sha224.SumString("Hello, world!")
fmt.Printf("String hash: %x\n", strHash)
// Hash an integer
intBytes := make([]byte, 4)
intBytes[0] = byte(12345 >> 24)
intBytes[1] = byte(12345 >> 16)
intBytes[2] = byte(12345 >> 8)
intBytes[3] = byte(12345)
intHash := sha224.Sum(intBytes)
fmt.Printf("Integer hash: %x\n", intHash)
// Different output formats
hexStr := hex.EncodeToString(hash[:])
fmt.Println("Hex:", hexStr)
base64Str := base64.StdEncoding.EncodeToString(hash[:])
fmt.Println("Base64:", base64Str)
}
Incremental Hashing
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
)
func main() {
// Create a hasher
h := sha224.New()
// Write data incrementally
h.Write([]byte("Part 1: "))
h.Write([]byte("Part 2: "))
h.Write([]byte("Part 3"))
// Get the final hash
sum := h.Sum(nil)
fmt.Printf("Incremental hash: %x\n", sum)
// Get the hash size and block size
fmt.Printf("Hash size: %d bytes\n", h.Size())
fmt.Printf("Block size: %d bytes\n", h.BlockSize())
// Reset and reuse
h.Reset()
h.Write([]byte("Fresh start"))
newSum := h.Sum(nil)
fmt.Printf("After reset: %x\n", newSum)
}
Hash Verification
package main
import (
"crypto/subtle"
"fmt"
"github.com/sha224-org/sha224-go"
)
func main() {
// Create a hash
data := []byte("Important data to verify")
hash := sha224.Sum(data)
hexHash := fmt.Sprintf("%x", hash)
fmt.Println("Original hash:", hexHash)
// Verify the data hasn't been modified
isValid := sha224.Verify(hexHash, data)
fmt.Println("Valid data:", isValid) // true
// Verify modified data
modifiedData := []byte("Important data to verify!")
isModifiedValid := sha224.Verify(hexHash, modifiedData)
fmt.Println("Valid modified data:", isModifiedValid) // false
// Manual secure comparison (constant-time)
otherHash := sha224.Sum([]byte("Important data to verify"))
// Use subtle.ConstantTimeCompare for secure comparison
secureCompare := subtle.ConstantTimeCompare(hash[:], otherHash[:]) == 1
fmt.Println("Secure comparison:", secureCompare) // true
}
File Operation Examples
Hashing Files
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"log"
"os"
)
func main() {
// Hash a file by path
fileHash, err := sha224.SumFile("document.pdf")
if err != nil {
log.Fatal(err)
}
fmt.Printf("File hash: %x\n", fileHash)
// Hash a file with progress reporting
bigFile, err := os.Open("large-file.bin")
if err != nil {
log.Fatal(err)
}
defer bigFile.Close()
fileInfo, err := bigFile.Stat()
if err != nil {
log.Fatal(err)
}
fileSize := fileInfo.Size()
// Rewind to the beginning
bigFile.Seek(0, 0)
hash, err := sha224.SumReaderWithProgress(bigFile, fileSize, func(bytesRead, totalSize int64) {
progress := float64(bytesRead) / float64(totalSize) * 100
fmt.Printf("\rProgress: %.2f%%", progress)
})
if err != nil {
log.Fatal(err)
}
fmt.Printf("\nLarge file hash: %x\n", hash)
}
Directory Hashing
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"log"
"os"
"path/filepath"
"sort"
)
func main() {
dirPath := "./project-directory"
dirHash, err := hashDirectory(dirPath)
if err != nil {
log.Fatal(err)
}
fmt.Printf("Directory hash: %x\n", dirHash)
}
func hashDirectory(dirPath string) ([28]byte, error) {
// Get all files recursively
var files []string
err := filepath.Walk(dirPath, func(path string, info os.FileInfo, err error) error {
if err != nil {
return err
}
if !info.IsDir() {
files = append(files, path)
}
return nil
})
if err != nil {
return [28]byte{}, err
}
// Sort files by path for consistency
sort.Strings(files)
// Create a master hasher
h := sha224.New()
// Hash each file and add its path and hash to the master hasher
for _, filePath := range files {
relPath, err := filepath.Rel(dirPath, filePath)
if err != nil {
return [28]byte{}, err
}
fileHash, err := sha224.SumFile(filePath)
if err != nil {
return [28]byte{}, err
}
// Add the relative path and hash to the master hasher
h.Write([]byte(relPath))
h.Write(fileHash[:])
fmt.Printf("Hashed: %s - %x\n", relPath, fileHash)
}
// Get the final hash
var result [28]byte
sum := h.Sum(nil)
copy(result[:], sum)
return result, nil
}
File Integrity Verification
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"io/ioutil"
"log"
)
func main() {
// Create a file to verify
filePath := "important-document.txt"
err := ioutil.WriteFile(filePath, []byte("This is important data that should not be modified."), 0644)
if err != nil {
log.Fatal(err)
}
// Calculate and save hash
hash, err := sha224.SumFile(filePath)
if err != nil {
log.Fatal(err)
}
hashHex := fmt.Sprintf("%x", hash)
hashFilePath := filePath + ".sha224"
err = ioutil.WriteFile(hashFilePath, []byte(hashHex), 0644)
if err != nil {
log.Fatal(err)
}
fmt.Printf("Created file with hash: %s\n", hashHex)
// Later, verify the file
isValid, err := verifyFileIntegrity(filePath, hashFilePath)
if err != nil {
log.Fatal(err)
}
fmt.Println("File integrity valid:", isValid)
// Modify the file and check again
err = ioutil.WriteFile(filePath, []byte("This is important data that should not be modified. THIS IS A MODIFICATION!"), 0644)
if err != nil {
log.Fatal(err)
}
isStillValid, err := verifyFileIntegrity(filePath, hashFilePath)
if err != nil {
log.Fatal(err)
}
fmt.Println("Modified file integrity valid:", isStillValid) // Should be false
}
func verifyFileIntegrity(filePath, hashFilePath string) (bool, error) {
// Read the stored hash
hashData, err := ioutil.ReadFile(hashFilePath)
if err != nil {
return false, err
}
expectedHash := string(hashData)
// Verify the file
return sha224.VerifyFile(expectedHash, filePath)
}
Advanced Examples
Custom I/O Wrappers
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"hash"
"io"
"log"
"os"
)
// HashingReader wraps an io.Reader to compute a hash while reading
type HashingReader struct {
reader io.Reader
hasher hash.Hash
}
func NewHashingReader(r io.Reader) *HashingReader {
return &HashingReader{
reader: r,
hasher: sha224.New(),
}
}
func (hr *HashingReader) Read(p []byte) (n int, err error) {
n, err = hr.reader.Read(p)
if n > 0 {
hr.hasher.Write(p[:n])
}
return
}
func (hr *HashingReader) Sum() [28]byte {
var result [28]byte
copy(result[:], hr.hasher.Sum(nil))
return result
}
// HashingWriter wraps an io.Writer to compute a hash while writing
type HashingWriter struct {
writer io.Writer
hasher hash.Hash
}
func NewHashingWriter(w io.Writer) *HashingWriter {
return &HashingWriter{
writer: w,
hasher: sha224.New(),
}
}
func (hw *HashingWriter) Write(p []byte) (n int, err error) {
n, err = hw.writer.Write(p)
if n > 0 {
hw.hasher.Write(p[:n])
}
return
}
func (hw *HashingWriter) Sum() [28]byte {
var result [28]byte
copy(result[:], hw.hasher.Sum(nil))
return result
}
func main() {
// Example with HashingReader
file, err := os.Open("document.pdf")
if err != nil {
log.Fatal(err)
}
defer file.Close()
hashingReader := NewHashingReader(file)
// Read and process the file
buffer := make([]byte, 1024)
for {
_, err := hashingReader.Read(buffer)
if err == io.EOF {
break
}
if err != nil {
log.Fatal(err)
}
// Process buffer data...
}
// Get the hash
hash := hashingReader.Sum()
fmt.Printf("File hash: %x\n", hash)
// Example with HashingWriter
outFile, err := os.Create("output.txt")
if err != nil {
log.Fatal(err)
}
defer outFile.Close()
hashingWriter := NewHashingWriter(outFile)
// Write data
data := []byte("This is test data that will be written to the file.")
_, err = hashingWriter.Write(data)
if err != nil {
log.Fatal(err)
}
// Get the hash of written data
writeHash := hashingWriter.Sum()
fmt.Printf("Written data hash: %x\n", writeHash)
}
Memory-Mapped Files
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"log"
"os"
"syscall"
)
func main() {
filePath := "large-file.bin"
// Hash with memory mapping for very large files
hash, err := hashWithMemoryMapping(filePath)
if err != nil {
log.Fatal(err)
}
fmt.Printf("Memory-mapped hash: %x\n", hash)
}
func hashWithMemoryMapping(filePath string) ([28]byte, error) {
// Open the file
file, err := os.Open(filePath)
if err != nil {
return [28]byte{}, err
}
defer file.Close()
// Get file info for size
fileInfo, err := file.Stat()
if err != nil {
return [28]byte{}, err
}
size := fileInfo.Size()
if size == 0 {
return sha224.Sum([]byte{}), nil
}
// Memory map the file
mmap, err := syscall.Mmap(
int(file.Fd()),
0,
int(size),
syscall.PROT_READ,
syscall.MAP_SHARED,
)
if err != nil {
return [28]byte{}, err
}
// Ensure munmap
defer syscall.Munmap(mmap)
// Hash the mapped memory
return sha224.Sum(mmap), nil
}
Content-Addressable Storage
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"io"
"io/ioutil"
"log"
"os"
"path/filepath"
)
// ContentStore implements a simple content-addressable storage
type ContentStore struct {
basePath string
}
// NewContentStore creates a new content-addressable storage
func NewContentStore(basePath string) (*ContentStore, error) {
err := os.MkdirAll(basePath, 0755)
if err != nil {
return nil, err
}
return &ContentStore{basePath: basePath}, nil
}
// Store adds a file to the content store
func (cs *ContentStore) Store(filePath string) (string, error) {
// Calculate the hash
hash, err := sha224.SumFile(filePath)
if err != nil {
return "", err
}
hashHex := fmt.Sprintf("%x", hash)
storedPath := filepath.Join(cs.basePath, hashHex)
// Check if the file already exists
if _, err := os.Stat(storedPath); err == nil {
fmt.Printf("File already exists in store: %s\n", hashHex)
return hashHex, nil
}
// Copy the file
err = copyFile(filePath, storedPath)
if err != nil {
return "", err
}
fmt.Printf("Stored file with hash: %s\n", hashHex)
return hashHex, nil
}
// Retrieve gets a file from the content store
func (cs *ContentStore) Retrieve(hashHex string) (string, error) {
storedPath := filepath.Join(cs.basePath, hashHex)
// Check if the file exists
if _, err := os.Stat(storedPath); os.IsNotExist(err) {
return "", fmt.Errorf("file with hash %s not found", hashHex)
}
return storedPath, nil
}
// Verify checks if a file matches its expected hash
func (cs *ContentStore) Verify(hashHex, filePath string) (bool, error) {
return sha224.VerifyFile(hashHex, filePath)
}
// Helper function to copy a file
func copyFile(src, dst string) error {
sourceFile, err := os.Open(src)
if err != nil {
return err
}
defer sourceFile.Close()
destFile, err := os.Create(dst)
if err != nil {
return err
}
defer destFile.Close()
_, err = io.Copy(destFile, sourceFile)
return err
}
func main() {
// Create a content store
store, err := NewContentStore("./content_store")
if err != nil {
log.Fatal(err)
}
// Create some test files
for i := 1; i <= 3; i++ {
filePath := fmt.Sprintf("file%d.txt", i)
content := fmt.Sprintf("This is test file %d content.", i)
err := ioutil.WriteFile(filePath, []byte(content), 0644)
if err != nil {
log.Fatal(err)
}
// Store the file
hash, err := store.Store(filePath)
if err != nil {
log.Fatal(err)
}
fmt.Printf("File %d stored with hash: %s\n", i, hash)
// Verify the stored file
isValid, err := store.Verify(hash, filePath)
if err != nil {
log.Fatal(err)
}
fmt.Printf("File %d verification: %v\n", i, isValid)
}
// Try to retrieve and verify
hashToRetrieve := "7d92e8759c21ff6243e249061231d33fbf16cd78b38d8245741bd128" // Example hash
filePath, err := store.Retrieve(hashToRetrieve)
if err == nil {
fmt.Printf("Retrieved file: %s\n", filePath)
} else {
fmt.Printf("Retrieval error: %s\n", err)
}
}
API Client Examples
Basic API Usage
package main
import (
"fmt"
"github.com/sha224-org/sha224-go/api"
"log"
"time"
)
func main() {
// Initialize the API client
client := api.NewClient("YOUR_API_KEY")
// Configure the client
client.SetBaseURL("https://api.sha224.com/v1")
client.SetTimeout(5 * time.Second)
// Hash text using the API
hash, err := client.HashText("Hello, world!")
if err != nil {
log.Fatal("API hash error:", err)
}
fmt.Println("API hash:", hash)
// Hash with custom encoding
latinHash, err := client.HashTextWithEncoding("Café", "latin1")
if err != nil {
log.Fatal("API encoding hash error:", err)
}
fmt.Println("Latin1 encoded hash:", latinHash)
// Verify a hash
isValid, err := client.Verify(hash, "Hello, world!")
if err != nil {
log.Fatal("API verify error:", err)
}
fmt.Println("Hash verification:", isValid) // true
}
Batch Operations
package main
import (
"fmt"
"github.com/sha224-org/sha224-go/api"
"log"
)
func main() {
// Initialize the API client
client := api.NewClient("YOUR_API_KEY")
// Create batch items
items := []api.BatchItem{
{ID: "item1", Text: "First item"},
{ID: "item2", Text: "Second item"},
{ID: "item3", Text: "Third item"},
}
// Batch hash
results, err := client.BatchHash(items)
if err != nil {
log.Fatal("Batch hash error:", err)
}
fmt.Println("Batch hash results:")
for _, result := range results {
fmt.Printf("ID: %s, Hash: %s\n", result.ID, result.Hash)
}
// Create verification items
verifyItems := []api.BatchVerifyItem{
{
ID: "item1",
Hash: results[0].Hash,
Text: "First item",
},
{
ID: "item2",
Hash: results[1].Hash,
Text: "Modified item", // This should fail verification
},
{
ID: "item3",
Hash: results[2].Hash,
Text: "Third item",
},
}
// Batch verify
verifyResults, err := client.BatchVerify(verifyItems)
if err != nil {
log.Fatal("Batch verify error:", err)
}
fmt.Println("\nBatch verify results:")
for _, result := range verifyResults {
fmt.Printf("ID: %s, Valid: %t\n", result.ID, result.Valid)
}
}
File Hashing with Progress
package main
import (
"fmt"
"github.com/sha224-org/sha224-go/api"
"log"
)
func main() {
// Initialize the API client
client := api.NewClient("YOUR_API_KEY")
// Hash a file with progress reporting
hash, err := client.HashFileWithProgress("large-file.bin", func(bytesRead, totalSize int64) {
if totalSize > 0 {
progress := float64(bytesRead) / float64(totalSize) * 100
fmt.Printf("\rProgress: %.2f%%", progress)
} else {
fmt.Printf("\rBytes read: %d", bytesRead)
}
})
fmt.Println() // New line after progress
if err != nil {
log.Fatal("File hash error:", err)
}
fmt.Println("File hash:", hash)
// Verify the file
isValid, err := client.VerifyFile(hash, "large-file.bin")
if err != nil {
log.Fatal("File verify error:", err)
}
fmt.Println("File verification:", isValid)
}
Error Handling and Retries
package main
import (
"fmt"
"github.com/sha224-org/sha224-go/api"
"log"
"net/http"
"time"
)
// Custom HTTP client with retries
type RetryTransport struct {
MaxRetries int
BaseBackoff time.Duration
transport http.RoundTripper
}
func NewRetryTransport(maxRetries int, baseBackoff time.Duration) *RetryTransport {
return &RetryTransport{
MaxRetries: maxRetries,
BaseBackoff: baseBackoff,
transport: http.DefaultTransport,
}
}
func (rt *RetryTransport) RoundTrip(req *http.Request) (*http.Response, error) {
var resp *http.Response
var err error
for attempt := 0; attempt <= rt.MaxRetries; attempt++ {
if attempt > 0 {
fmt.Printf("Retry attempt %d/%d\n", attempt, rt.MaxRetries)
// Exponential backoff: BaseBackoff * 2^(attempt-1)
time.Sleep(rt.BaseBackoff * time.Duration(1<<uint(attempt-1)))
}
// Note: requests with bodies need req.GetBody so the body can be
// re-read on retry
resp, err = rt.transport.RoundTrip(req)
// Retry on transport errors and 5xx responses
if err == nil && resp.StatusCode < 500 {
return resp, nil
}
}
return resp, err
}
func main() {
// Install the retrying transport as the default transport so the API
// client's requests pass through it (this assumes the client uses the
// default HTTP transport)
http.DefaultTransport = NewRetryTransport(3, 500*time.Millisecond)
client := api.NewClient("YOUR_API_KEY")
hash, err := client.HashText("Hello, world!")
if err != nil {
log.Fatal("API hash error:", err)
}
fmt.Println("Hash:", hash)
}
Concurrency Patterns
Go's concurrency model works exceptionally well with CPU-intensive tasks like hashing. This section presents patterns for concurrent hash operations.
Worker Pool for Multiple Files
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"log"
"os"
"path/filepath"
"sync"
)
// Result represents a hashing result
type Result struct {
Path string
Hash [28]byte
Err error
}
func main() {
// Directory to scan
dirPath := "./documents"
// Number of worker goroutines
numWorkers := 4
// Get all files to hash
var filePaths []string
err := filepath.Walk(dirPath, func(path string, info os.FileInfo, err error) error {
if err != nil {
return err
}
if !info.IsDir() {
filePaths = append(filePaths, path)
}
return nil
})
if err != nil {
log.Fatal(err)
}
fmt.Printf("Found %d files to hash\n", len(filePaths))
// Create input and output channels
jobs := make(chan string, len(filePaths))
results := make(chan Result, len(filePaths))
// Start worker pool
var wg sync.WaitGroup
wg.Add(numWorkers)
for i := 0; i < numWorkers; i++ {
go func(id int) {
defer wg.Done()
worker(id, jobs, results)
}(i)
}
// Send jobs to the workers
for _, path := range filePaths {
jobs <- path
}
close(jobs)
// Wait for all workers to complete
go func() {
wg.Wait()
close(results)
}()
// Collect results
for result := range results {
if result.Err != nil {
fmt.Printf("Error hashing %s: %v\n", result.Path, result.Err)
} else {
fmt.Printf("%s: %x\n", result.Path, result.Hash)
}
}
}
// worker processes files from jobs channel and sends results to results channel
func worker(id int, jobs <-chan string, results chan<- Result) {
for path := range jobs {
hash, err := sha224.SumFile(path)
results <- Result{
Path: path,
Hash: hash,
Err: err,
}
}
}
Fan-Out/Fan-In for Large Files
package main
import (
"fmt"
"github.com/sha224-org/sha224-go"
"io"
"log"
"os"
"sync"
)
// ChunkResult represents hashing result for a chunk
type ChunkResult struct {
Index int
Hash [28]byte
BytesRead int64
}
func main() {
filePath := "very-large-file.bin"
// Open the file
file, err := os.Open(filePath)
if err != nil {
log.Fatal(err)
}
defer file.Close()
// Get file size
fileInfo, err := file.Stat()
if err != nil {
log.Fatal(err)
}
fileSize := fileInfo.Size()
// Configure chunking
numChunks := 8
chunkSize := fileSize / int64(numChunks)
// Create channels
jobs := make(chan int, numChunks)
results := make(chan ChunkResult, numChunks)
// Start workers
var wg sync.WaitGroup
for i := 0; i < 4; i++ { // 4 worker goroutines
wg.Add(1)
go func() {
defer wg.Done()
for chunkIndex := range jobs {
offset := int64(chunkIndex) * chunkSize
length := chunkSize
// Adjust length for the last chunk
if chunkIndex == numChunks-1 {
length = fileSize - offset
}
hash, bytesRead, err := hashFileChunk(filePath, offset, length)
if err != nil {
log.Printf("Error hashing chunk %d: %v", chunkIndex, err)
continue
}
results <- ChunkResult{
Index: chunkIndex,
Hash: hash,
BytesRead: bytesRead,
}
}
}()
}
// Send jobs to workers
for i := 0; i < numChunks; i++ {
jobs <- i
}
close(jobs)
// Wait for all workers and close results
go func() {
wg.Wait()
close(results)
}()
// Collect chunk results
var chunkResults = make([]ChunkResult, numChunks)
var totalBytesRead int64
for result := range results {
chunkResults[result.Index] = result
totalBytesRead += result.BytesRead
fmt.Printf("Chunk %d processed: %d bytes\n", result.Index, result.BytesRead)
}
// Check if all chunks were processed
if totalBytesRead != fileSize {
log.Fatalf("Expected to read %d bytes, but read %d bytes", fileSize, totalBytesRead)
}
// Combine chunk hashes into a final hash
h := sha224.New()
for _, result := range chunkResults {
h.Write(result.Hash[:])
}
finalHash := h.Sum(nil)
fmt.Printf("Final hash for %s: %x\n", filePath, finalHash)
}
// hashFileChunk hashes a chunk of a file starting at offset and reading length bytes
func hashFileChunk(filePath string, offset, length int64) ([28]byte, int64, error) {
file, err := os.Open(filePath)
if err != nil {
return [28]byte{}, 0, err
}
defer file.Close()
// Seek to the chunk start
_, err = file.Seek(offset, 0)
if err != nil {
return [28]byte{}, 0, err
}
// Create a limited reader to read only the chunk
limitReader := io.LimitReader(file, length)
// Create hasher
h := sha224.New()
// Copy data from file to hasher
bytesRead, err := io.Copy(h, limitReader)
if err != nil {
return [28]byte{}, bytesRead, err
}
// Convert to fixed-size array
var result [28]byte
copy(result[:], h.Sum(nil))
return result, bytesRead, nil
}
Bounded Parallel Processing
package main
import (
"context"
"fmt"
"github.com/sha224-org/sha224-go"
"golang.org/x/sync/semaphore"
"io/ioutil"
"log"
"os"
"runtime"
"sync"
)
func main() {
// Create some test data
dataDir := "./testdata"
err := os.MkdirAll(dataDir, 0755)
if err != nil {
log.Fatal(err)
}
// Create 100 test files
var paths []string
for i := 0; i < 100; i++ {
path := fmt.Sprintf("%s/file%d.txt", dataDir, i)
content := fmt.Sprintf("Test file %d content", i)
err := ioutil.WriteFile(path, []byte(content), 0644)
if err != nil {
log.Fatal(err)
}
paths = append(paths, path)
}
// Hash files with bounded parallelism
results, err := hashFilesWithBoundedParallelism(paths)
if err != nil {
log.Fatal(err)
}
// Print results
for path, hash := range results {
fmt.Printf("%s: %x\n", path, hash)
}
}
// hashFilesWithBoundedParallelism hashes files with bounded parallelism
func hashFilesWithBoundedParallelism(paths []string) (map[string][28]byte, error) {
// Get number of CPUs
maxWorkers := int64(runtime.NumCPU())
// Create a semaphore to limit concurrency
sem := semaphore.NewWeighted(maxWorkers)
ctx := context.Background()
// Create a map to store results
results := make(map[string][28]byte)
var resultsMutex sync.Mutex
// Create a WaitGroup to wait for all goroutines
var wg sync.WaitGroup
var errorsSlice []error
var errorsMutex sync.Mutex
// Process files
for _, path := range paths {
// Acquire a semaphore slot
if err := sem.Acquire(ctx, 1); err != nil {
return nil, err
}
wg.Add(1)
go func(p string) {
defer wg.Done()
defer sem.Release(1)
// Hash the file
hash, err := sha224.SumFile(p)
if err != nil {
errorsMutex.Lock()
errorsSlice = append(errorsSlice, fmt.Errorf("hash %s: %w", p, err))
errorsMutex.Unlock()
return
}
// Store the result
resultsMutex.Lock()
results[p] = hash
resultsMutex.Unlock()
}(path)
}
// Wait for all goroutines to complete
wg.Wait()
// Check for errors
if len(errorsSlice) > 0 {
return results, fmt.Errorf("encountered %d errors during hashing", len(errorsSlice))
}
return results, nil
}
Frequently Asked Questions
Why use this SDK when Go's standard library already supports SHA-224?
Go's standard library provides SHA-224 through the crypto/sha256 package (sha256.Sum224 and sha256.New224). SHA-224 is a truncated variant of SHA-256 with a different initialization vector, and on top of that baseline our SDK provides:
- Dedicated SHA-224 implementation: Specifically optimized for SHA-224
- Convenience methods: Purpose-built functions for common operations like file hashing and verification
- API client: Integration with the SHA224.com REST API
- Performance optimizations: Specialized algorithms for better throughput
The Go SDK is fully compatible with standard library interfaces like hash.Hash, so it integrates seamlessly with existing code that expects those interfaces.
What is the most efficient way to hash large files?
For the most efficient hashing of large files in Go, you have several options:
- Use SumFile(): The sha224.SumFile() function is optimized for file hashing and provides good performance in most cases.
- Memory-mapped files: For very large files, memory mapping (as shown in the advanced examples) can provide better performance by letting the OS handle memory management.
- Concurrent chunk processing: For extremely large files, splitting the file into chunks and processing them concurrently (as shown in the Fan-Out/Fan-In example) can leverage multiple CPU cores.
- API client: For the largest files, or when CPU resources are limited, offloading the work to the SHA224.com API can be more efficient.
Which approach is most efficient depends on your specific use case, file size, and available resources. For most applications, the built-in SumFile() function provides the best balance of simplicity and performance.
Is the SDK safe for concurrent use?
Go's concurrency model uses goroutines rather than threads, but the same safety considerations apply:
- Top-level functions: Functions like Sum(), SumFile(), and Verify() are safe to call concurrently from multiple goroutines.
- Hash instances: Individual hash instances created with New() maintain their own state and should not be used concurrently without proper synchronization. Each goroutine should have its own hash instance.
- API client: The API client is designed to be safe for concurrent use from multiple goroutines.
The library is designed to work well with Go's concurrency patterns and can be used efficiently in highly concurrent applications. The examples in the "Concurrency Patterns" section demonstrate how to use the SDK effectively in concurrent contexts.
Can I use the SDK in web services?
Yes, the SHA-224 Go SDK is well-suited for use in web services and API servers. Here's a simple example using the standard net/http package:
package main
import (
"encoding/json"
"fmt"
"github.com/sha224-org/sha224-go"
"io/ioutil"
"log"
"net/http"
)
type HashRequest struct {
Text string `json:"text"`
}
type HashResponse struct {
Hash string `json:"hash"`
}
func main() {
http.HandleFunc("/hash", hashHandler)
log.Println("Starting server on :8080")
log.Fatal(http.ListenAndServe(":8080", nil))
}
func hashHandler(w http.ResponseWriter, r *http.Request) {
// Only allow POST
if r.Method != http.MethodPost {
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
return
}
// Read request body
body, err := ioutil.ReadAll(r.Body)
if err != nil {
http.Error(w, "Error reading request body", http.StatusBadRequest)
return
}
// Parse JSON
var req HashRequest
if err := json.Unmarshal(body, &req); err != nil {
http.Error(w, "Invalid JSON", http.StatusBadRequest)
return
}
// Calculate hash
hash := sha224.SumString(req.Text)
// Create response
resp := HashResponse{
Hash: fmt.Sprintf("%x", hash),
}
// Send response
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(resp)
}
For production web services, consider adding rate limiting, request validation, and proper error handling. The SDK's performance characteristics make it suitable for high-throughput web services, and its concurrency patterns work well with Go's HTTP server model.
Is it faster to hash locally with the SDK or remotely via the API?
The performance comparison between local computation using the Go SDK and remote computation using the API depends on several factors:
| Scenario | Local SDK | API |
|---|---|---|
| Small data (< 1 MB) | Faster - no network latency | Slower - network overhead outweighs computation |
| Medium data (1-100 MB) | Usually faster - efficient local processing | Slower - network transfer time is significant |
| Large data (> 100 MB) | Varies - depends on local CPU power | May be faster with a good network connection and server power |
| Resource-constrained devices | Slower - limited by local resources | Faster - offloads computation |
| High volume (many hash operations) | Better for high throughput with parallelism | Limited by API rate limits and network constraints |
In general, for most Go applications, using the local SDK implementation provides the best performance. The API is most useful when you need to offload computation from resource-constrained devices or when you need consistent performance across different client environments.